916 results for digital terrain analysis


Relevance: 30.00%

Abstract:

This thesis examines a complete design framework for a real-time, autonomous system with specialized VLSI hardware for computing 3-D camera motion. In the proposed architecture, the first step is to determine point correspondences between two images. Two processors, a CCD array edge detector and a mixed analog/digital binary block correlator, are proposed for this task. The report is divided into three parts. Part I covers the algorithmic analysis; part II describes the design and test of a 32×32 CCD edge detector fabricated through MOSIS; and part III compares the design of the mixed analog/digital correlator to a fully digital implementation.
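The matching step named here, binary block correlation between edge maps, can be pictured in software as scoring candidate windows by the number of agreeing binary edge pixels (equivalently, minimising the Hamming distance). A minimal sketch under that reading; the function name, search range, and array layout are illustrative and not taken from the thesis, whose correlator is mixed analog/digital hardware:

```python
import numpy as np

def binary_block_correlate(block_a, image_b, y0, x0, search=8):
    """Match a binary edge block from image A against image B.

    Each candidate offset is scored by the count of agreeing pixels,
    which is equivalent to minimising the Hamming distance between
    the two binary blocks.
    """
    h, w = block_a.shape
    best_score, best_offset = -1, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + h > image_b.shape[0] or x + w > image_b.shape[1]:
                continue  # candidate window falls outside image B
            candidate = image_b[y:y + h, x:x + w]
            score = int(np.sum(candidate == block_a))  # agreement count
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return best_offset, best_score
```

Run over every edge block in the first image, a correlator like this yields the point correspondences that the motion computation consumes.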

Relevance: 30.00%

Abstract:

The creative industries sector faces a constantly changing context, characterised by the speed at which digital information systems and information and communications technologies (ICT) are developed and deployed on a global scale. This continuous digital disruption has had a significant impact on the whole value chain of the sector: creation and production; discovery and distribution; and consumption of cultural goods and services. As a result, creative enterprises must evolve their business and operational models and practices to remain sustainable. Enterprises of all scales, types, and operational models are affected, and all sectors face ongoing digital disruption. Management consultancy practitioners and business strategy academics have called for new strategy development frameworks and toolkits fit for a continuously changing world. This thesis investigates a novel approach to organisational change appropriate to the digital age, in the context of the creative sector in Scotland. A set of concepts, methods, tools, and processes to generate theoretical learning and practical knowing was created to support enterprises in adapting digitally by undertaking journeys of change and organisational development. The framework is called The AmbITion Approach. It was developed by blending participatory action research (PAR) methods with modern management consultancy, design, and creative practices. Empirical work also introduced Coghlan and Rashford's change categories to the framework. These enabled the definition and description of the extent to which organisations developed: whether they experienced first order (change), second order (adaptation) or third order (transformation) change. Digital research tools for inquiry were tested in a pilot study and then embedded in a two-year longitudinal study of twenty-one participant organisations from Scotland's creative sector. The author applied and investigated the novel approach in a national digital development programme for Scotland's creative industries; the programme was designed and delivered by the author and ran nationally from 2012 to 2014. Detailed grounded thematic analysis of the data corpus was undertaken, along with analysis of the rich-media case studies produced by the organisations about their change journeys. The results of the studies of participants, and the validation criteria applied to those results, demonstrated that the framework triggers second order (adaptation) and third order (transformation) change in creative industry enterprises. The AmbITion Approach framework is therefore suitable for the continuing landscape of digital disruption within the creative sector. The thesis contributes to practice the concepts, methods, tools, and processes of The AmbITion Approach, which have been empirically tested in the field and validated as a new framework for business transformation in a digital age. It contributes to knowledge a theoretical and conceptual framework, with a specific set of constructs and criteria that define first, second, and third order change in creative enterprises, and a robust research and action framework for analysing the quality, validity, and change achieved by action-research-based development programmes. The thesis additionally contributes to the practice of research, adding to our understanding of the value of PAR, design thinking approaches, and creative practices as methods for change.

Relevance: 30.00%

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 30.00%

Abstract:

This thesis examines the interaction between the user and the digital archive. The aim of the study is to support an in-depth examination of the interaction process, with a view to producing recommendations and tools, for system designers and archival professionals, that promote development of the digital archive domain. Following a comprehensive literature review, an urgent requirement for models was identified. The Model of Contextual Interaction presented in this thesis provides a conceptual model through which the interaction process between the user and the digital archive can be examined. Following a five-phased research development framework, the study presents a structured account of its methods, employing a multi-method methodology to ensure robust data collection and analysis. The findings of the study are presented across the Model of Contextual Interaction, and provide a basis on which recommendations and tools for system designers have been made. The thesis concludes with a summary of key findings and a reflective account of how the findings and the Model of Contextual Interaction have influenced digital provision within the archive domain and how the model could be applied to other domains.

Relevance: 30.00%

Abstract:

As a by-product of the 'information revolution' which is currently unfolding, lifetimes of man (and indeed computer) hours are being allocated to the automated and intelligent interpretation of data. This is particularly true in medical and clinical settings, where research into machine-assisted diagnosis of physiological conditions gains momentum daily. Of the conditions which have been addressed, however, automated classification of allergy has not been investigated, even though the number of allergic persons is rising and undiagnosed allergies are the most likely to elicit fatal consequences. On the basis of the observations of allergists who conduct oral food challenges (OFCs), activity-based analyses of allergy tests were performed. Algorithms were investigated and validated in a pilot study, which verified that accelerometer-based inquiry of human movements is particularly well suited to the objective appraisal of activity. However, when these analyses were applied to OFCs, accelerometer-based investigations were found to provide very poor separation between allergic and non-allergic persons, and it was concluded that the avenues explored in this thesis are inadequate for the classification of allergy. Heart rate variability (HRV) analysis is known to provide very significant diagnostic information for many conditions. Owing to this, electrocardiograms (ECGs) were recorded during OFCs for the purpose of assessing the effect that allergy induces on HRV features. It was found that, with appropriate analysis, excellent separation between allergic and non-allergic subjects can be obtained. These results were, however, obtained with manual QRS annotations, which are not a viable methodology for real-time diagnostic applications. Even so, this is the first work to categorically correlate changes in HRV features with the onset of allergic events, and the manual annotations yield undeniable affirmation of this. Encouraged by the successful results obtained with manual classifications, automatic QRS detection algorithms were investigated to facilitate the fully automated classification of allergy. The results obtained by this process are very promising. Most importantly, the work presented in this thesis did not produce any false positive classifications. This is a most desirable result for OFC classification, as it allows complete confidence to be attributed to classifications of allergy. Furthermore, these results could be particularly advantageous in clinical settings, as machine-based classification can detect the onset of allergy, allowing early termination of OFCs. Consequently, machine-based monitoring of OFCs has in this work been shown to possess the capacity to significantly and safely advance the current state of the clinical art of allergy diagnosis.
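The HRV features at the centre of the classification can be illustrated with standard time-domain measures computed from QRS (R-peak) annotation times. The thesis's exact feature set is not listed in the abstract, so SDNN and RMSSD stand in here as common examples:

```python
import numpy as np

def hrv_features(qrs_times_s):
    """Basic time-domain HRV features from R-peak times (in seconds).

    SDNN captures overall variability of the RR intervals; RMSSD
    captures beat-to-beat variability. Illustrative stand-ins only.
    """
    rr = np.diff(qrs_times_s) * 1000.0                  # RR intervals in ms
    sdnn = float(np.std(rr, ddof=1))                    # overall variability
    rmssd = float(np.sqrt(np.mean(np.diff(rr) ** 2)))   # beat-to-beat variability
    return {"mean_rr_ms": float(np.mean(rr)),
            "sdnn_ms": sdnn,
            "rmssd_ms": rmssd}
```

In this spirit, it is changes in such features over the course of an OFC, rather than a single snapshot, that the thesis correlates with the onset of allergic events.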

Relevance: 30.00%

Abstract:

The overall objective of this thesis is to integrate a number of micro/nanotechnologies into integrated cartridge-type systems that implement biochemical protocols. Instrumentation and systems were developed to interface with such cartridges: (i) implementing microfluidic handling, (ii) executing thermal control during biochemical protocols and (iii) detecting biomolecules associated with inherited or infectious disease. The first system implements biochemical protocols for DNA extraction, amplification and detection. A digital microfluidic chip (electrowetting-on-dielectric, EWOD) manipulated droplets of sample and reagent, implementing sample preparation protocols. The cartridge also integrated a planar magnetic microcoil device to generate local magnetic field gradients for manipulating magnetic beads. For hybridisation detection, a fluorescence microarray screening for mutations associated with the CFTR gene is printed on a waveguide surface and integrated within the cartridge. A second cartridge system was developed to implement amplification and detection, screening for DNA associated with disease-causing pathogens, e.g. Escherichia coli. This system incorporates (i) elastomeric pinch valves isolating liquids during biochemical protocols and (ii) a silver nanoparticle microarray for fluorescent signal enhancement using localised surface plasmon resonance. The microfluidic structures allowed the sample and reagent to be loaded and moved between chambers, with external heaters implementing the thermal steps for nucleic acid amplification and detection. Finally, in a technique allowing probe DNA to be immobilised within a microfluidic system using three-dimensional (3D) hydrogel structures, a prepolymer solution containing probe DNA was formulated and introduced into the microfluidic channel. Photo-polymerisation was then undertaken, forming 3D hydrogel structures attached to the microfluidic channel surface. The prepolymer material, poly(ethylene glycol) (PEG), was used to form hydrogel structures containing probe DNA. This hydrogel formation process was fast compared to conventional biomolecule immobilisation techniques and was biocompatible with the immobilised biomolecules, as verified by on-chip hybridisation assays. The process allowed control over hydrogel height at the micron scale.
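The EWOD actuation at the heart of the sample-preparation chip moves a droplet by energising the adjacent electrode pad and releasing the one beneath the droplet, one pad per step. A minimal sketch of that control loop; the `set_electrode` driver and pad indexing are hypothetical, not the thesis's interface:

```python
def move_droplet(path, set_electrode, start_pad):
    """Walk a droplet along `path` (a list of electrode pad indices).

    EWOD transport: the energised pad lowers the local contact angle,
    pulling the droplet onto it; de-energising the previous pad
    releases the droplet. One pad transition per actuation step.
    """
    pos = start_pad
    for nxt in path:
        set_electrode(nxt, on=True)    # pull droplet onto the next pad
        set_electrode(pos, on=False)   # release the pad it sits on
        pos = nxt                      # droplet follows the energised pad
    return pos
```

Sequencing such moves, together with merge, split and mix primitives, is what lets droplets of sample and reagent execute an extraction protocol on chip.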

Relevance: 30.00%

Abstract:

Phase-locked loops (PLLs) are a crucial component in modern communications systems. Comprising a phase detector, linear filter, and controllable oscillator, they are widely used in radio receivers to retrieve the information content of remote signals. As such, they are capable of signal demodulation, phase and carrier recovery, frequency synthesis, and clock synchronisation. Continuous-time PLLs are a mature area of study, covered in the literature since the early classical work by Viterbi [1] in the 1950s. With the rise of computing in recent decades, discrete-time digital PLLs (DPLLs) are a more recent discipline; most of the published literature dates from the 1990s onwards, with Gardner [2] a pioneer in the area. Our aim in this work is to address the difficulties encountered by Gardner [3] in his investigation of DPLL output phase jitter when additive noise on the input signal is combined with frequency quantization in the local oscillator. The model we use in our novel analysis of the system is also applicable to another of the cases examined by Gardner, namely the DPLL with a delay element integrated in the loop. This gives us the opportunity to look at that system in more detail, our analysis providing some unique insights into the variance 'dip' seen by Gardner in [3]. We initially provide background on probability theory and stochastic processes, the branches of mathematics that underpin the study of noisy analogue and digital PLLs. We give an overview of classical analogue PLL theory as well as background on both the digital PLL and the circle map, referencing the model proposed by Teplinsky et al. [4, 5]. For our novel work, the case of combined frequency quantization and noisy input from [3] is investigated first numerically, and then analytically as a Markov chain via its Chapman-Kolmogorov equation. The resulting delay equation for the steady-state jitter distribution is treated using two separate asymptotic analyses to obtain approximate solutions. The variance obtained in each case is shown to match the numerical results well. Other properties of the output jitter, such as the mean, are also investigated. In this way, we arrive at a more complete understanding of the interaction between quantization and input noise in the first-order DPLL than is possible using simulation alone. We also perform an asymptotic analysis of a particular case of the noisy first-order DPLL with delay, previously investigated by Gardner [3], and show that a distinctive feature of the simulation results, namely the variance 'dip' seen for certain levels of input noise, is explained by this analysis. Finally, we look at the second-order DPLL with additive noise, using numerical simulations to examine the effects of low levels of noise on the limit cycles. We show that these effects are similar to those seen in the noise-free loop with non-zero initial conditions.
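The interplay of frequency quantization and input noise studied here can be reproduced numerically with a toy first-order DPLL: the loop applies a correction proportional to the detected phase error, but the correction is quantized to the oscillator's frequency steps, and the detector output carries additive Gaussian noise. A minimal sketch under assumed dynamics; the update rule and every parameter value are illustrative, not Gardner's or the thesis's actual model:

```python
import numpy as np

def dpll_jitter(n_steps=200_000, omega=0.0127, gain=0.1,
                quantum=1 / 256, noise_std=0.01, seed=0):
    """Simulate a toy first-order DPLL; return steady-state phase error.

    `omega` is the input frequency offset per sample, `quantum` the
    oscillator's frequency-quantization step, `noise_std` the additive
    noise on the detected phase. Phase is wrapped to [-0.5, 0.5).
    """
    rng = np.random.default_rng(seed)
    phi, out = 0.0, np.empty(n_steps)
    for n in range(n_steps):
        measured = phi + rng.normal(0.0, noise_std)   # noisy phase detector
        correction = quantum * np.round(gain * measured / quantum)  # quantized step
        phi = (phi + omega - correction + 0.5) % 1.0 - 0.5          # wrap phase
        out[n] = phi
    return out[n_steps // 2:]  # discard the acquisition transient

jitter = dpll_jitter()
print(f"steady-state jitter variance ~ {jitter.var():.3e}")
```

Sweeping `noise_std` in a simulation of this kind is the sort of numerical experiment against which the thesis's Markov-chain analysis is validated.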

Relevance: 30.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed, yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse those data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
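The platform's central device, pairing every derived result with a maintainable record of provenance, can be pictured as an append-only log in which each workflow step records what it consumed, what parameters it used and a fingerprint of what it produced. A minimal sketch of the idea; the class and field names are hypothetical, as the abstract does not describe the platform's actual schema:

```python
import hashlib
import json
import time

class ProvenanceLog:
    """Append-only record of how each derived result was produced,
    so independent third-party analysis can re-verify conclusions."""

    def __init__(self):
        self.entries = []

    def record(self, step, inputs, params, output):
        digest = hashlib.sha256(
            json.dumps(output, sort_keys=True, default=str).encode()
        ).hexdigest()
        self.entries.append({
            "step": step,             # which analysis technique ran
            "inputs": inputs,         # identifiers of the data consumed
            "params": params,         # exact parameters, so the run repeats
            "output_sha256": digest,  # fingerprint of the data produced
            "time": time.time(),      # when it happened
        })
        return output
```

Because each step is logged in terms of the production, interpretation and consumption of data, a second analysis technique can be applied to the same raw inputs and audited against the same record.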

Relevance: 30.00%

Abstract:

The Leaving Certificate (LC) is the national, standardised state examination in Ireland necessary for entry to third-level education; it presents a massive, raw corpus of data with the potential to yield invaluable insight into the phenomena of learner interlanguage. With samples of official LC Spanish examination data, this project has compiled a digitised corpus of learner Spanish comprising the written and oral production of 100 candidates. This corpus was then analysed using a specific investigative corpus technique, Computer-aided Error Analysis (CEA; Dagneaux et al., 1998). CEA is a powerful apparatus in that it greatly facilitates the quantification and analysis of a large learner corpus in digital format. The corpus was both compiled and analysed with the use of UAM Corpus Tool (O'Donnell, 2013). The tool allows for the recording of candidate-specific variables such as grade, examination level, task type and gender, therefore allowing for critical analysis of the corpus as one unit, as separate written and oral sub-corpora, and also of performance per task, level and gender. This is an interdisciplinary work combining aspects of Applied Linguistics, Learner Corpus Research and Foreign Language (FL) Learning. Beginning with a review of the context of FL learning in Ireland and Europe, I go on to discuss the disciplinary context and theoretical framework for this work and outline the methodology applied. I then perform detailed quantitative and qualitative analyses before combining all research findings and outlining the principal conclusions. This investigation does not make a priori assumptions about the data set, the LC Spanish examination, the context of FLs, or any aspect of learner competence. It undertakes to provide the linguistic research community, and the domain of Spanish language learning and pedagogy in Ireland, with an empirical, descriptive profile of real learner performance, characterising learner difficulty.
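The quantitative side of CEA reduces, at its simplest, to tallying error tags across the candidate-specific variables the corpus records. A minimal sketch of that aggregation; the record layout and tag names are hypothetical, and UAM Corpus Tool's actual export format differs:

```python
from collections import Counter

# Hypothetical error-annotated records, one per tagged error.
records = [
    {"candidate": 1, "level": "Higher", "gender": "F",
     "task": "written", "error": "gender_agreement"},
    {"candidate": 2, "level": "Ordinary", "gender": "M",
     "task": "oral", "error": "verb_morphology"},
]

def errors_by(records, variable):
    """Tally (variable value, error tag) pairs, e.g. errors per level."""
    return Counter((r[variable], r["error"]) for r in records)

print(errors_by(records, "level"))
print(errors_by(records, "task"))
```

The same tally, run per level, task and gender, yields the comparative profiles of learner difficulty that the study reports.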

Relevance: 30.00%

Abstract:

Brazilian composer Heitor Villa-Lobos (1887-1959) began his musical career as a cellist. When he was only twelve years old, it became imperative, upon the sudden and untimely death of his father, that the young Villa-Lobos earn money as a cellist to provide financial support for his mother and sisters. Villa-Lobos's intimate relationship with the cello eventually inspired him to compose great music for this instrument. This dissertation explores both the diversity of compositional technique and the evolution of style found in the music for cello written by Villa-Lobos. The project consists of two recorded recital performances and a written document exploring and analyzing those pieces. In the study of the music of Villa-Lobos, it is of great interest to consider the music's traditional European elements in combination (or even juxtaposition) with its imaginative and sometimes wildly innovative Brazilian character. His early works were greatly influenced by European Romantic composers such as Robert Schumann, Frédéric Chopin, and the virtuoso cellist/composer David Popper (whom Villa-Lobos idolized). Later, Villa-Lobos flourished in a newfound compositional independence and moved away from Euro-romanticism and toward the folk music of his Brazilian homeland. It is intriguing to experience this transition through an exploration of his cello compositions. The works examined and performed in this dissertation project are chosen from among the extensive number of Villa-Lobos's cello compositions and are his most important works for cello with piano, cello with another instrument, and cello with orchestra. The chosen works demonstrate the evolving range and combination of characteristic elements found in Villa-Lobos's compositional repertoire.

Relevance: 30.00%

Abstract:

The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduced sample and reagent volumes, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, sample collection from an air stream must be integrated with the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.

Relevance: 30.00%

Abstract:

Gemstone Team WAVES (Water and Versatile Energy Systems)

Relevance: 30.00%

Abstract:

The ability to manipulate small fluid droplets, colloidal particles and single cells with the precision and parallelization of modern-day computer hardware has profound applications for biochemical detection, gene sequencing, chemical synthesis and highly parallel analysis of single cells. Drawing inspiration from general circuit theory and magnetic bubble technology, here we demonstrate a class of integrated circuits for executing sequential and parallel, timed operations on an ensemble of single particles and cells. The integrated circuits are constructed from lithographically defined, overlaid patterns of magnetic film and current lines. The magnetic patterns passively control particles, in the manner of electrical conductors, diodes and capacitors; the current lines actively switch particles between different tracks, in the manner of gated electrical transistors. When combined into arrays and driven by a rotating magnetic field clock, these integrated circuits have general multiplexing properties and enable the precise control of magnetizable objects.
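The circuit analogy invites a toy model: a particle sits in a magnetic energy minimum on a track, advances one site per rotation of the external field clock, and is diverted onto a branch track when a current line (the 'transistor gate') is energised. Everything below is illustrative; the actual devices are lithographic patterns, not code:

```python
def run_clock(track_length, branch_site, gate_on, cycles):
    """Advance a particle one track site per field-clock rotation.

    At `branch_site`, an energised current line diverts the particle
    onto a branch track, the way a gated transistor switches current.
    """
    pos, on_branch = 0, False
    for _ in range(cycles):
        if pos == branch_site and gate_on and not on_branch:
            on_branch = True                   # gate switches the particle's track
        pos = min(pos + 1, track_length - 1)   # hop to the next energy minimum
    return pos, on_branch

print(run_clock(track_length=10, branch_site=3, gate_on=True, cycles=6))
```

Because every particle on the chip feels the same rotating-field clock, arrays of such tracks and gates can execute timed operations on a whole ensemble in parallel.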