563 results for "Immersed boundary methods"


Abstract:

Speaker diarization is the process of annotating an input audio stream with information that attributes temporal regions of the signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems answer the question of ‘who spoke when’. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems that allow users to directly access the relevant segments of interest within a given audio recording, and assisting with downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted by a speaker diarization system can provide complementary information for ASR transcripts, including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers and pool their data for model adaptation, which in turn boosts transcription accuracy. Speaker diarization therefore plays an important role as a preliminary step in the automatic transcription of audio data.

The aim of this work is to improve the usefulness and practicality of speaker diarization technology through the reduction of diarization error rates. In particular, this research focuses on the segmentation and clustering stages of a diarization system. Although particular emphasis is placed on the broadcast news audio domain, and the systems developed throughout this work are trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains, including telephone conversations and meeting audio. Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling.

The use of heuristic approaches for the speaker segmentation task was investigated first, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed with the aim of improving the detection of boundaries around short speaker segments. Compared to single-threshold-based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate.

Methods to model the uncertainty in speaker model estimates were developed to address the difficulties associated with making segmentation and clustering decisions from limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information about the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was then extended to the eigenvoice speaker modelling framework for the speaker clustering task. Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
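The flavour of such Gaussian segmentation decisions can be illustrated with the closely related generalized likelihood ratio (GLR) between a ‘same speaker’ and a ‘different speakers’ hypothesis for two adjacent segments. The sketch below is illustrative only: it uses maximum-likelihood Gaussian fits and omits the integration over parameter uncertainty that distinguishes the Bayes factor approach described above; all names are hypothetical.

```python
import numpy as np

def glr_boundary_score(x, y):
    """GLR between 'one speaker' (pooled Gaussian) and 'two speakers'
    (separate Gaussians) for adjacent feature segments x and y, each an
    (n_frames, n_dims) array. Higher scores suggest a change point."""
    def nll(a):
        # Maximised Gaussian log-likelihood reduces to -n/2 * log|Sigma_ML|
        # plus constants that cancel in the difference below.
        _, logdet = np.linalg.slogdet(np.cov(a, rowvar=False, bias=True))
        return 0.5 * len(a) * logdet

    return nll(np.vstack([x, y])) - nll(x) - nll(y)

# Hypothetical usage with 13-dimensional cepstral-like features:
rng = np.random.default_rng(0)
seg_a = rng.normal(0.0, 1.0, size=(200, 13))
seg_b = rng.normal(0.5, 1.0, size=(150, 13))
print(glr_boundary_score(seg_a, seg_b))  # large value -> likely boundary
```

In a segmentation pass, this score would be computed over a sliding window and thresholded; the dissertation's point is that fixed thresholds and point estimates are unreliable for short segments, motivating the heuristic and Bayesian refinements above.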

Abstract:

This paper explores what we are calling “Guerrilla Research Tactics” (GRT): research methods that exploit emerging mobile and cloud-based digital technologies. We examine some case studies in the use of this technology to generate research data directly from the physical fabric and the people of the city. We argue that GRT is a novel way of engaging public participation in urban, place-based research because it facilitates the co-creation of knowledge with city inhabitants ‘on the fly’. This paper discusses the potential of these new research techniques and what they have to offer researchers operating in the creative disciplines and beyond. This work builds on and extends Gauntlett’s “new creative methods” (2007) and contributes to the existing body of literature addressing creative and interactive approaches to data collection.

Abstract:

Now, as in earlier periods of acute change in the media environment, new disciplinary articulations are producing new methods for media and communication research. At the same time, established media and communication studies methods are being recombined, reconfigured, and remediated alongside their objects of study. This special issue of JOBEM seeks to explore the conceptual, political, and practical aspects of emerging methods for digital media research. It does so at the conjuncture of a number of important contemporary trends: the rise of a “third wave” of the Digital Humanities and the “computational turn” (Berry, 2011) associated with natively digital objects and the methods for studying them; the apparently ubiquitous Big Data paradigm, with its various manifestations across academia, business, and government, which brings with it a rapidly increasing interest in social media communication and online “behavior” from the “hard” sciences; along with the multisited, embodied, and emplaced nature of everyday digital media practice.

Abstract:

Transport processes within heterogeneous media may exhibit non-classical diffusion or dispersion; that is, transport not adequately described by the classical theory of Brownian motion and Fick's law. We consider a space-fractional advection-dispersion equation based on a fractional Fick's law. The equation involves the Riemann-Liouville fractional derivative, which arises from assuming that particles may make large jumps. Finite difference methods for solving this equation have been proposed by Meerschaert and Tadjeran. In the variable-coefficient case, the product rule is first applied, and the Riemann-Liouville fractional derivatives are then discretised using standard and shifted Grünwald formulas, depending on the fractional order. In this work, we consider a finite volume method that deals directly with the equation in conservative form. Fractionally-shifted Grünwald formulas are used to discretise the fractional derivatives at control volume faces. We compare the two methods for several case studies from the literature, highlighting the convenience of the finite volume approach.
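For reference, the Grünwald approximations referred to here take the following standard form for fractional order α on a uniform mesh of spacing h. This is an illustrative statement of the commonly used shifted formula (shift p = 1, for 1 < α ≤ 2), not necessarily the exact fractionally-shifted variant the paper places at control volume faces:

```latex
g_k^{(\alpha)} = (-1)^k \binom{\alpha}{k}, \qquad
\left.\frac{\partial^{\alpha} u}{\partial x^{\alpha}}\right|_{x_i}
\approx \frac{1}{h^{\alpha}} \sum_{k=0}^{i+1} g_k^{(\alpha)}\, u\!\left(x_{i-k+1}\right).
```

The unshifted formula takes the sum over u(x_{i-k}); the shift is what makes implicit schemes for 1 < α ≤ 2 stable, which is why the choice between standard and shifted forms depends on the fractional order.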

Abstract:

In this paper, spectral approximations are used to compute the fractional integral and the Caputo derivative. Effective recursive formulae based on the Legendre, Chebyshev and Jacobi polynomials are developed to approximate the fractional integral, and a succinct scheme for approximating the Caputo derivative is also derived. A collocation method is proposed to solve fractional initial value problems and boundary value problems. Numerical examples are provided to illustrate the effectiveness of the derived methods.
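The operators being approximated are standard. With the usual conventions (the paper may differ in normalisation or interval), the Riemann-Liouville fractional integral of order ν > 0 and the Caputo derivative of order ν with n − 1 < ν < n are:

```latex
I^{\nu} f(t) = \frac{1}{\Gamma(\nu)} \int_{0}^{t} (t-s)^{\nu-1} f(s)\, \mathrm{d}s,
\qquad
{}^{C}\!D^{\nu} f(t) = \frac{1}{\Gamma(n-\nu)} \int_{0}^{t} (t-s)^{n-\nu-1} f^{(n)}(s)\, \mathrm{d}s .
```

Applying these operators to orthogonal polynomial expansions is what yields the recursive formulae: the fractional integral of each basis polynomial can be expressed back in the same basis, so coefficients propagate by recursion.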

Abstract:

Rayleigh–Stokes problems have received much attention in recent years due to their importance in physics. In this article, we focus on the variable-order Rayleigh–Stokes problem for a heated generalized second grade fluid with a fractional derivative. Implicit and explicit numerical methods are developed to solve the problem. The convergence and stability of the numerical methods, and the solvability of the implicit numerical method, are discussed via Fourier analysis. Moreover, a numerical example is given, and the results support the effectiveness of the theoretical analysis.
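One commonly studied one-dimensional form of this problem is sketched below for orientation; the article's exact operator, domain and coefficients may differ. Here γ is a material constant, f a source term, and the time-fractional operator has variable order α(x, t) ∈ (0, 1):

```latex
\frac{\partial u(x,t)}{\partial t}
= \frac{\partial^{2} u(x,t)}{\partial x^{2}}
+ \gamma \, {}_{0}D_{t}^{\alpha(x,t)} \frac{\partial^{2} u(x,t)}{\partial x^{2}}
+ f(x,t),
```

subject to suitable initial and boundary conditions. The variable order is what prevents standard fixed-order Fourier arguments from applying directly, hence the dedicated stability and convergence analysis.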

Abstract:

Exposure to cold air, whole body cryotherapy (WBC), is a novel treatment employed by athletes. In WBC, individuals dressed in minimal clothing are exposed to a temperature below -100°C for 2-4 min. The use of WBC has been advocated as a treatment for various knee injuries.

PURPOSE: To compare the effects of two modalities of cryotherapy, -110°C WBC and 8°C cold water immersion (CWI), on knee skin temperature (Tsk).

METHODS: With ethical approval and written informed consent, 10 healthy active male participants (26.5±4.9 yr, 183.5±6.0 cm, 90.7±19.9 kg, 26.8±5.0 kg/m2, 23.0±9.3% body fat (measured by DXA), 7.6±2.0 mm patellar skin fold; mean±SD) were exposed to 4 min of CWI and WBC. The treatment order was randomised in a controlled crossover design, with a minimum of 7 days between treatments. During WBC, participants stood in a chamber (-60±3°C) for 20 s before entering the main chamber (-110±3°C), where they remained for 3 min and 40 s. For CWI, participants were seated in a tank filled with cold water (8±0.3°C) and immersed to the level of the sternum for 4 min. Right knee Tsk was assessed via non-contact, infrared thermal imaging. A quadrilateral region of interest was created using inert markers placed 5 cm above and below the most superior and inferior aspects of the patella. Tsk within this quadrilateral was recorded pre-treatment, immediately post-treatment and every 10 min thereafter for 60 min. Tsk changes were examined using a two-way (treatment x time) repeated measures analysis of variance. In addition, a paired sample t-test was used to compare baseline Tsk before both treatments.

RESULTS: Knee Tsk was similar before treatment (WBC: 29.9±0.7°C, CWI: 29.6±0.9°C, p>0.05). There was a significant main effect for treatment (p<0.05) and time (p<0.001). Compared to baseline, Tsk was significantly reduced (p<0.05) immediately post-treatment and at 10, 20, 30, 40, 50 and 60 min after both cooling modalities. Knee Tsk was lower (p<0.05) immediately after WBC (19.0±0.9°C) compared to CWI (20.5±0.6°C). However, from 10 to 60 min post-treatment, knee Tsk was lower (p<0.05) following the CWI treatment.

CONCLUSION: WBC elicited a greater decrease in knee Tsk than CWI immediately after treatment. However, the two modalities displayed different recovery patterns, and Tsk after CWI was significantly lower than after WBC at 10, 20, 30, 40, 50 and 60 min after treatment.

Abstract:

Spreadsheet for Creative City Index 2012

Abstract:

Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools, and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images, depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration, since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1–5% of the maximum cell density.
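The sensitivity reported here concerns the threshold used to call the leading edge from an image. A minimal sketch of such threshold-based edge detection follows (illustrative only; the function, variable names and synthetic data are hypothetical, not the paper's pipeline):

```python
import numpy as np

def leading_edge_position(density_image, threshold):
    """Locate the leading edge of a spreading cell population.

    density_image: 2D array of estimated cell density (e.g. smoothed,
    normalised pixel intensities), with spreading along axis 1.
    threshold: density level that defines the 'edge'; the paper shows
    the measured spreading rate is highly sensitive to this choice.
    """
    profile = density_image.mean(axis=0)          # average across the barrier
    above = np.flatnonzero(profile >= threshold)  # columns above threshold
    return int(above.max()) if above.size else 0  # furthest such column

# Varying the threshold shifts the detected edge, mimicking the ~1-5% of
# maximum cell density variation the abstract describes.
rng = np.random.default_rng(1)
img = np.clip(np.linspace(1.0, 0.0, 200) + rng.normal(0, 0.02, (100, 200)), 0, 1)
print(leading_edge_position(img, 0.05), leading_edge_position(img, 0.01))
```

Because the detected edge moves systematically with the threshold, two labs running nominally the same assay with different thresholds will report different spreading rates, which is the comparability problem the paper quantifies.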

Abstract:

Parametric and generative modelling methods are ways in which computer models are made more flexible and in which domain-specific knowledge is formalised. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the base for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard in which the interchange of components could be implemented. As an illustrative example of generative design, Frazer’s ‘Reptiles’ project from 1968 is reinterpreted.
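To illustrate what ‘parametric’ means in this context (a toy sketch only; the class and attribute names are hypothetical and unrelated to the IFC schema), a parametric component keeps derived geometry as functions of named parameters, so changing a parameter propagates automatically — the dependency that an interchange standard would need to capture:

```python
from dataclasses import dataclass

@dataclass
class ParametricWall:
    """Toy parametric component: derived quantities are recomputed from
    the driving parameters rather than stored as fixed geometry."""
    length: float              # metres
    height: float              # metres
    window_ratio: float = 0.2  # fraction of the facade that is glazed

    def facade_area(self) -> float:
        return self.length * self.height

    def window_area(self) -> float:
        return self.facade_area() * self.window_ratio

wall = ParametricWall(length=12.0, height=3.0)
print(wall.window_area())   # 7.2
wall.length = 15.0          # change one driving parameter...
print(wall.window_area())   # 9.0 ...and the derived geometry updates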

Abstract:

Qualitative Health Psychology aims to contribute to the debate about the nature of psychology and of science through ‘an examination of the role of qualitative research within health psychology’ (p. 3). The editors, in bringing together contributors from the UK, Ireland, Canada, Brazil, New Zealand and Australia, have compiled a text that reflects different uses of qualitative health research in diverse social and cultural contexts. Structured into three parts, the book encompasses key theoretical and methodological issues in qualitative research in its attempt to encourage broad epistemological debate within health psychology.

Abstract:

The appearance of poststructuralism as a research methodology in public health literature raises questions about the history and purpose of this research. We examine (a) some aspects of the history of qualitative methods and their place within larger social and research domains, and (b) the purposes of public health research that employs poststructuralist philosophy, delineating the methodological issues that require consideration in positing a poststructural analysis. We argue against poststructuralism becoming a research methodology deployed to seize the public health debate, rather than being employed for its own particular critical strengths.

Abstract:

This dissertation analyses how physical objects are translated into digital artworks using techniques that can introduce ‘imperfections’ into the resulting digital artwork, imperfections that are typically removed to arrive at a ‘perfect’ final representation. The dissertation discusses the adaptation of existing techniques into an artistic workflow that acknowledges and incorporates the imperfections of translation into the final pieces. It presents an exploration of the relationship between physical and digital artefacts and the processes used to move between the two. The work explores the ‘craft’ of digital sculpting and the technology used in producing what the artist terms ‘a naturally imperfect form’, incorporating knowledge of traditional sculpture, an understanding of anatomy and an interest in the study of bones (osteology). The outcomes of the research are presented as a series of digital sculptural works, exhibited as a collection of curiosities in multiple mediums, including interactive game spaces, augmented reality (AR), rapid prototype prints (RP) and video displays.

Abstract:

1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys.

2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii).

3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection 40%, although this recall increased with signal strength (precision and recall in the standard detection sense; see the sketch after this abstract). The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of the field survey, respectively. The methods using autonomous recorders were more adversely affected by wind and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same result for the most important conservation information from the survey: the annual change in calling activity.

4. Autonomous monitoring techniques incur different biases from manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
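The precision and recall figures in point 3 are the standard detection-theory quantities, scored against the field survey as ground truth. A minimal sketch of the bookkeeping (the counts below are hypothetical, chosen only to echo the reported ~99% precision and 80% recall):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Standard detection scores against ground truth: tp = detections
    matching a true call, fp = spurious detections, fn = missed calls."""
    precision = tp / (tp + fp)  # fraction of detections that are real calls
    recall = tp / (tp + fn)     # fraction of real calls that are detected
    return precision, recall

# Hypothetical counts only, shaped to mirror the abstract's figures:
p, r = precision_recall(tp=80, fp=1, fn=20)
print(f"precision = {p:.1%}, recall = {r:.1%}")  # precision = 98.8%, recall = 80.0%
```

The asymmetry matters for the conclusions above: high precision with moderate recall means automated detections are trustworthy but undercount calls, a bias that is acceptable when the quantity of interest is the year-on-year trend rather than the absolute call count.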

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which yields centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user's requirements. Undetected incorrect integers lead to hazardous results, so their rate should be strictly controlled; in ambiguity resolution this missed-detection rate is known as the failure rate.

In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. A criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking it up. This method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold under the fixed failure rate approach are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when paired with a proper stochastic model.
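For orientation, the ratio test referred to throughout compares the weighted residual norms of the two best integer candidates. A minimal sketch of the acceptance rule follows, assuming the common ‘second-best over best’ form; the paper's contribution is choosing the critical value c from a fixed failure rate via a criteria table, which this sketch does not reproduce:

```python
def ratio_test_accept(q_best: float, q_second: float, c: float) -> bool:
    """Accept the fixed integer ambiguity vector when the second-best
    candidate is sufficiently worse than the best one.

    q_best, q_second: squared, covariance-weighted distances of the two
    best integer candidates from the float ambiguity solution.
    c: critical value -- fixed empirically in common practice (e.g. 2 or 3),
    or looked up from a failure-rate-based criteria table as in the paper.
    """
    return q_second / q_best >= c

# With an empirical threshold c = 2.0, this float solution would be fixed:
print(ratio_test_accept(q_best=1.3, q_second=3.1, c=2.0))  # True
```

Under the fixed failure rate approach, c is no longer a constant: it is chosen per scenario so that the probability of accepting a wrong integer vector stays at the prescribed failure rate, which is why the threshold varies with model strength and dimensionality.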