68 results for "Dynamic search fireworks algorithm with covariance mutation"


Relevance: 100.00%

Abstract:

A new spatial logic encompassing redefined concepts of time and place, space and distance, requires a comprehensive shift in the approach to designing workplace environments for today’s adaptive, collaborative organizations operating in a dynamic business world. Together with substantial economic and cultural shifts and an increased emphasis on lifestyle considerations, the advances in information technology have prompted a radical re-ordering of organizational relationships and the associated structures, processes, and places of doing business. Within the duality of space and an augmentation of the traditional notions of place, organizational and institutional structures pose new challenges for the design professions. The literature reveals a persistently mono-organizational focus in workplace design strategies; the burgeoning trend towards inter-organizational collaboration therefore enabled the identification of a gap in the knowledge relative to workplace design. The NetWorkPlace™© constitutes a multi-dimensional concept with the capacity to deal with the fluidity and ambiguity characteristic of the network context, both as a topic of research and as a way of going about it.

Relevance: 100.00%

Abstract:

The presence of air and bone interfaces makes the dose distribution for head and neck cancer treatments difficult to accurately predict. This study compared planning system dose calculations using the collapsed-cone convolution algorithm with EGSnrc Monte Carlo simulation results obtained using the Monte Carlo DICOMToolKit software, for one oropharynx, two paranasal sinus and three nodal treatment plans. The difference between median doses obtained from the treatment planning and Monte Carlo calculations was found to be greatest in two bilateral treatments: 4.8% for a retropharyngeal node irradiation and 6.7% for an ethmoid paranasal sinus treatment. These deviations in median dose were smaller for two unilateral treatments: 0.8% for an infraclavicular node irradiation and 2.8% for a cervical node treatment. Examination of isodose distributions indicated that the largest deviations between Monte Carlo simulation and collapsed-cone convolution calculations were seen in the bilateral treatments, where the increase in calculated dose beyond air cavities was most significant.
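
As a point of reference for the percentage figures quoted above, the short sketch below shows how such a median-dose deviation can be computed; the function name and the numeric values are illustrative assumptions, not data from the study.

    # Sketch: relative deviation of a treatment-planning-system (TPS) median dose from
    # the Monte Carlo (MC) reference median dose, expressed in percent as in the abstract.
    # The example values below are hypothetical, not taken from the study.
    def median_dose_deviation(tps_median_gy: float, mc_median_gy: float) -> float:
        return 100.0 * (tps_median_gy - mc_median_gy) / mc_median_gy

    # e.g. a plan where the planning system over-predicts dose beyond an air cavity
    print(f"{median_dose_deviation(52.4, 50.0):.1f}%")  # -> 4.8%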

Relevance: 100.00%

Abstract:

The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision involving a choice of the extent of their income to report to tax authorities, given a certain institutional environment, represented by parameters such as the probability of detection and penalties in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it indicates a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view towards addressing this issue, and examining the political economy implications of tax evasion for progressivity in the tax structure. The approach followed involves building a macroeconomic, dynamic equilibrium model for the purpose of examining these issues, using a step-wise model building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents to initially decide whether to evade taxes or not. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report. We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature, and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion impacts on the consumption smoothing ability of the agent by creating two states of nature in which the agent is ‘caught’ or ‘not caught’, there is a possibility that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically choose to vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model building procedure involve grafting the two-period models with a political economy choice into a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the model’s ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking; there is now a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not-evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to the situations in which tax rates are high. Some further implications of the models of this thesis relate to whether variations in the level of inequality, and parameters such as the probability of detection and penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate for a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
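
The ‘evade or not’ margin described above turns on whether expected utility across the ‘caught’ and ‘not caught’ states exceeds utility under full compliance. A minimal numerical sketch of that comparison follows; the log-utility form and all parameter values are illustrative assumptions, not the thesis’s specification or calibration.

    # Sketch of the 'evade or not' comparison: an agent evades only if expected utility
    # over the 'caught' / 'not caught' states exceeds the utility of full compliance.
    # Functional form (log utility) and all parameter values are illustrative only.
    import math

    def utility(consumption: float) -> float:
        return math.log(consumption)

    def prefers_evasion(income, tax_rate, undeclared, detection_prob, penalty_rate):
        # Consumption when reporting truthfully (certainty).
        c_comply = income * (1.0 - tax_rate)
        # Consumption if evasion goes undetected: tax is avoided on the undeclared share.
        c_not_caught = income - tax_rate * (income - undeclared)
        # Consumption if caught: evaded tax is repaid plus a proportional penalty.
        c_caught = c_not_caught - (1.0 + penalty_rate) * tax_rate * undeclared
        expected_u_evade = (detection_prob * utility(c_caught)
                            + (1.0 - detection_prob) * utility(c_not_caught))
        return expected_u_evade > utility(c_comply)

    print(prefers_evasion(income=100.0, tax_rate=0.3, undeclared=40.0,
                          detection_prob=0.05, penalty_rate=1.5))  # True for these values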

Relevance: 100.00%

Abstract:

The challenge of persistent appearance-based navigation and mapping is to develop an autonomous robotic vision system that can simultaneously localize, map and navigate over the lifetime of the robot. However, the computation time and memory requirements of current appearance-based methods typically scale not only with the size of the environment but also with the operation time of the platform; also, repeated revisits to locations will develop multiple competing representations which reduce recall performance. In this paper we present a solution to the persistent localization, mapping and global path planning problem in the context of a delivery robot in an office environment over a one-week period. Using a graphical appearance-based SLAM algorithm, CAT-Graph, we demonstrate constant time and memory loop closure detection with minimal degradation during repeated revisits to locations, along with topological path planning that improves over time without using a global metric representation. We compare the localization performance of CAT-Graph to openFABMAP, an appearance-only SLAM algorithm, and the path planning performance to occupancy-grid based metric SLAM. We discuss the limitations of the algorithm with regard to environment change over time and illustrate how the topological graph representation can be coupled with local movement behaviors for persistent autonomous robot navigation.
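
The topological planning idea referred to above operates on a graph of appearance-based places rather than a global metric map. The sketch below illustrates that style of planning with a generic Dijkstra search over learned traversal costs; it is not the CAT-Graph implementation, and the place names and costs are invented.

    # Sketch: route planning over a topological appearance graph (no metric map).
    # Nodes are appearance-based places; edge weights are learned traversal costs.
    # Generic Dijkstra illustration only, not the CAT-Graph algorithm from the paper.
    import heapq

    def shortest_route(graph, start, goal):
        """graph: {node: {neighbour: traversal_cost}}; returns (cost, [nodes])."""
        frontier = [(0.0, start, [start])]
        visited = set()
        while frontier:
            cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for nbr, w in graph.get(node, {}).items():
                if nbr not in visited:
                    heapq.heappush(frontier, (cost + w, nbr, path + [nbr]))
        return float("inf"), []

    offices = {"dock": {"corridor": 4.0}, "corridor": {"office_12": 2.5, "kitchen": 3.0},
               "kitchen": {"office_12": 1.5}, "office_12": {}}
    print(shortest_route(offices, "dock", "office_12"))  # (6.5, ['dock', 'corridor', 'office_12'])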

Relevance: 100.00%

Abstract:

Speaker diarization is the process of annotating an input audio recording with information that attributes temporal regions of the audio signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems effectively answer the question of ‘who spoke when’. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems to allow users to directly access the relevant segments of interest within a given audio recording, and assisting with other downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted from a speaker diarization system can provide complementary information for ASR transcripts, including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers to pool data for model adaptation, which in turn boosts transcription accuracies. Speaker diarization therefore plays an important role as a preliminary step in automatic transcription of audio data. The aim of this work is to improve the usefulness and practicality of speaker diarization technology through the reduction of diarization error rates. In particular, this research is focused on the segmentation and clustering stages within a diarization system. Although particular emphasis is placed on the broadcast news audio domain, and systems developed throughout this work are also trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains including telephone conversations and meetings audio. Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling. The use of heuristic approaches for the speaker segmentation task was first investigated, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed with the aim of improving detection of boundaries around short speaker segments. Compared to single-threshold-based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate. Methods to model the uncertainty in speaker model estimates were developed to address the difficulties associated with making segmentation and clustering decisions with limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information regarding the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was also extended to the eigenvoice speaker modelling framework for the speaker clustering task.
Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
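
For the segmentation and clustering decisions discussed above, a widely used baseline criterion is the ΔBIC test between single-Gaussian and two-Gaussian hypotheses for a pair of speech segments. The sketch below implements that baseline only; the Bayes factor developed in the thesis, which additionally models the uncertainty of the speaker model estimates, is not reproduced here.

    # Sketch: Delta-BIC decision on whether two speech segments come from the same speaker,
    # using full-covariance Gaussians. This is the common baseline criterion, not the
    # Bayes factor derived in the thesis. Feature data below is synthetic.
    import numpy as np

    def delta_bic(x: np.ndarray, y: np.ndarray, penalty_weight: float = 1.0) -> float:
        """x, y: (frames, features). Positive value favours 'different speakers'."""
        z = np.vstack([x, y])
        def log_det_cov(a):
            return np.linalg.slogdet(np.cov(a, rowvar=False))[1]
        n_x, n_y, n_z = len(x), len(y), len(z)
        d = z.shape[1]
        # Gain in log likelihood from modelling the segments separately...
        gain = 0.5 * (n_z * log_det_cov(z) - n_x * log_det_cov(x) - n_y * log_det_cov(y))
        # ...penalised for the extra Gaussian's parameters (mean + full covariance).
        n_params = d + d * (d + 1) / 2
        return gain - penalty_weight * 0.5 * n_params * np.log(n_z)

    rng = np.random.default_rng(0)
    a, b = rng.normal(0, 1, (200, 12)), rng.normal(2, 1, (200, 12))
    print(delta_bic(a, b) > 0)  # True: the two segments look like different speakers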

Relevance: 100.00%

Abstract:

The rapid growth of visual information on the Web has led to immense interest in multimedia information retrieval (MIR). While advancement in MIR systems has achieved some success in specific domains, particularly with content-based approaches, general Web users still struggle to find the images they want. Despite the success in content-based object recognition or concept extraction, the major problem in current Web image searching remains in the querying process. Since most online users only express their needs in semantic terms or objects, systems that utilize visual features (e.g., color or texture) to search images create a semantic gap which hinders general users from fully expressing their needs. In addition, query-by-example (QBE) retrieval imposes extra obstacles for exploratory search because users may not always have a representative image at hand or in mind when starting a search (i.e. the page zero problem). As a result, the majority of current online image search engines (e.g., Google, Yahoo, and Flickr) still primarily use textual queries to search. The problem with query-based retrieval systems is that they only capture users’ information needs in terms of formal queries; the implicit and abstract parts of users’ information needs are inevitably overlooked. Hence, users often struggle to formulate queries that best represent their needs, and some compromises have to be made. Studies of Web search logs suggest that multimedia searches are more difficult than textual Web searches, and that Web image searching is the most difficult compared to video or audio searches. Hence, online users need to put in more effort when searching multimedia content, especially for image searches. Most interactions in Web image searching occur during query reformulation. While log analysis provides intriguing views on how the majority of users search, their search needs or motivations are ultimately neglected. User studies on image searching have attempted to understand users’ search contexts in terms of users’ background (e.g., knowledge, profession, motivation for search and task types) and the search outcomes (e.g., use of retrieved images, search performance). However, these studies typically focused on particular domains with a selective group of professional users. General users’ Web image searching contexts and behaviors are little understood, although they represent the majority of online image searching activities nowadays. We argue that only by understanding Web image users’ contexts can current Web search engines further improve their usefulness and provide more efficient searches. In order to understand users’ search contexts, a user study was conducted based on university students’ Web image searching in News, Travel, and commercial Product domains. The three search domains were deliberately chosen to reflect image users’ interests in people, time, event, location, and objects. We investigated participants’ Web image searching behavior, with a focus on query reformulation and search strategies. Participants’ search contexts, such as their search background, motivation for search, and search outcomes, were gathered by questionnaires. The searching activity was recorded together with participants’ think-aloud data for analyzing significant search patterns. The relationships between participants’ search contexts and corresponding search strategies were discovered using a Grounded Theory approach.
Our key findings include the following aspects:
- Effects of users’ interactive intents on query reformulation patterns and search strategies
- Effects of task domain on task specificity and task difficulty, as well as on some specific searching behaviors
- Effects of searching experience on result expansion strategies
A contextual image searching model was constructed based on these findings. The model helped us understand Web image searching from the user perspective, and introduced a context-aware searching paradigm for current retrieval systems. A query recommendation tool was also developed to demonstrate how users’ query reformulation contexts can potentially contribute to more efficient searching.

Relevance: 100.00%

Abstract:

Introduction: Nursing in the cardiac catheterisation laboratory (CCL) varies globally in terms of scope and deployment. In the US, all allied staff are cross-trained into all CCL roles. In Australia and New Zealand, by contrast, legislative frameworks reserve specific functions for nurses. Yet the nursing role within the CCL is poorly researched and defined. Aim: This study sought to gain a deeper understanding of the perceived role of CCL nurses in Australia and New Zealand. Method: A descriptive qualitative study using semi-structured in-depth interviews was used. A cross-sectional sample of 23 senior clinical nurses or nursing managers representing 16 CCLs across Australia and New Zealand was obtained. Data were digitally recorded and transcribed verbatim prior to analysis by three researchers. Results: Five major themes emerged from the data: 1. The CCL is a unique environment; 2. CCL nursing is a unique and advanced cardiac nursing discipline; 3. The recruitment attributes for CCL nurses are advanced; 4. Education needs to be standardised; and 5. The evidence to support practice is poor. Discussion: The CCL environment is a dynamic, deeply interdisciplinary setting, and CCL nursing is seen to be a unique advanced practice role. Participants expressed that the time has come for a defined scope of practice, educational standards, guidelines and competencies. Conclusion: Nursing in the CCL is an advanced practice role working within a complex interdisciplinary environment. Further work is required to define the role of CCL nurses together with the evidence base for their practice.

Relevance: 100.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time by solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and it can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement for a benchmark problem generated by our heuristic algorithm with a conventional mapper/reducer placement which puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
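
Because the placement problem above generalizes bin packing, a simple packing heuristic gives a feel for the problem class. The first-fit-decreasing sketch below is a generic illustration under assumed per-task resource demands; it is not the heuristic algorithm proposed in the paper.

    # Sketch: generic first-fit-decreasing bin packing applied to mapper/reducer placement.
    # Task demands and machine capacity are hypothetical; this is not the paper's heuristic.
    def first_fit_decreasing(task_demands, machine_capacity):
        """task_demands: resource demand per mapper/reducer task; returns (placement, machines used)."""
        machines = []      # remaining capacity of each provisioned machine
        placement = {}     # task index -> machine index
        for task_id, demand in sorted(enumerate(task_demands), key=lambda t: -t[1]):
            for m, free in enumerate(machines):
                if free >= demand:          # place task on the first machine that still fits it
                    machines[m] -= demand
                    placement[task_id] = m
                    break
            else:                           # no existing machine fits: provision a new one
                machines.append(machine_capacity - demand)
                placement[task_id] = len(machines) - 1
        return placement, len(machines)

    demands = [0.6, 0.5, 0.5, 0.4, 0.3, 0.2]   # hypothetical per-task CPU shares
    placement, used = first_fit_decreasing(demands, machine_capacity=1.0)
    print(used, placement)                     # 3 machines for this example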

Relevance: 100.00%

Abstract:

Recently, the botnet, a network of compromised computers, has been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called the Command and Control (C & C) channel. There are three main C & C channels: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, the web-based botnet has reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C & C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method applies a unique identifier of the computer, an encryption algorithm with session keys, and CAPTCHA verification.
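
Two of the ingredients mentioned above, a unique machine identifier and session-key encryption, are sketched below in a purely illustrative form; the paper's actual scheme, including its CAPTCHA step, is not reproduced, and the snippet assumes the third-party 'cryptography' package.

    # Sketch: bind a per-machine identifier to a per-session key so traffic lacking the
    # expected token can be rejected. Illustrative only; not the scheme from the paper.
    # Requires the third-party 'cryptography' package.
    import hashlib
    import uuid
    from cryptography.fernet import Fernet

    def machine_identifier() -> bytes:
        """Stable, non-reversible identifier derived from the host's MAC address."""
        return hashlib.sha256(uuid.getnode().to_bytes(6, "big")).digest()

    session_key = Fernet.generate_key()    # fresh key per session
    cipher = Fernet(session_key)

    token = cipher.encrypt(machine_identifier())          # sent with each request
    assert cipher.decrypt(token) == machine_identifier()  # verified by the receiving side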

Relevance: 100.00%

Abstract:

Murine models with modified gene function as a result of N-ethyl-N-nitrosourea (ENU) mutagenesis have been used to study phenotypes resulting from genetic change. This study investigated genetic factors associated with red blood cell (RBC) physiology and structural integrity that may impact on blood component storage and transfusion outcome. Forward and reverse genetic approaches were employed with pedigrees of ENU-treated mice using a homozygous recessive breeding strategy. In a “forward genetic” approach, pedigree selection was based upon identification of an altered phenotype followed by exome sequencing to identify a causative mutation. In a second strategy, a “reverse genetic” approach based on selection of pedigrees with mutations in genes of interest was utilised and, following breeding to homozygosity, phenotype was assessed. Thirty-three pedigrees were screened by the forward genetic approach. One pedigree demonstrated reticulocytosis, microcytic anaemia and thrombocytosis. Exome sequencing revealed a novel single nucleotide variation (SNV) in Ank1, encoding the RBC structural protein ankyrin-1, and the pedigree was designated Ank1EX34. The reticulocytosis and microcytic anaemia observed in the Ank1EX34 pedigree were similar to clinical features of hereditary spherocytosis in humans. For the reverse genetic approach, three pedigrees with different point mutations in Spnb1, encoding the RBC protein spectrin-1β, and one pedigree with a mutation in Epb4.1, encoding band 4.1, were selected for study. When bred to homozygosity, two of the spectrin-1β pedigrees (a, b) demonstrated increased RBC count, haemoglobin (Hb) and haematocrit (HCT). The third Spnb1 mutation (spectrin-1β c) and the mutation in Epb4.1 (band 4.1) did not significantly affect the haematological phenotype, despite these two mutations having PolyPhen scores predicting that they may be damaging. Exome sequencing allows rapid identification of causative mutations and development of databases of mutations predicted to be disruptive. These tools require further refinement but provide new approaches to the study of genetically defined changes that may impact on blood component storage and transfusion outcome.

Relevance: 100.00%

Abstract:

This paper presents a control design for tracking of attitude and speed of an underactuated slender-hull unmanned underwater vehicle (UUV). The control design is based on Port-Hamiltonian theory. The target dynamics (desired dynamic response) are shaped with particular attention to the target mass matrix, so that the influence of the unactuated dynamics on the controlled system is suppressed. This results in achievable dynamics independent of uncontrolled states. Throughout the design, insight into the physical phenomena involved is used to propose the desired target dynamics. The performance of the design is demonstrated through simulation with a high-fidelity model.
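
The abstract states the design without equations; for orientation, the generic port-Hamiltonian form and a shaped target Hamiltonian of the kind such designs typically assume are shown below. The symbols, including the target mass matrix M_d referred to above, are generic notation rather than the paper's.

    % Generic port-Hamiltonian dynamics and a shaped target Hamiltonian; the notation
    % here is illustrative and not taken from the paper.
    \dot{x} = \bigl[ J(x) - R(x) \bigr] \nabla H(x) + g(x)\, u,
    \qquad
    H_d(q,p) = \tfrac{1}{2}\, p^{\top} M_d^{-1}(q)\, p + V_d(q)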

Relevance: 100.00%

Abstract:

Sphingosine 1-phosphate (SPP), a bioactive sphingolipid metabolite, inhibits chemoinvasiveness of the aggressive, estrogen-independent MDA-MB-231 human breast cancer cell line. As in many other cell types, SPP stimulated proliferation of MDA-MB-231 cells, albeit to a lesser extent. Treatment of MDA-MB-231 cells with SPP had no significant effect on their adhesiveness to Matrigel, and only high concentrations of SPP partially inhibited matrix metalloproteinase-2 activation induced by Con A. However, SPP at a concentration that strongly inhibited invasiveness also markedly reduced chemotactic motility. To investigate the molecular mechanisms by which SPP interferes with cell motility, we examined tyrosine phosphorylation of focal adhesion kinase (FAK) and paxillin, which are important for organization of focal adhesions and cell motility. SPP rapidly increased tyrosine phosphorylation of FAK and paxillin and of the paxillin-associated protein Crk. Overexpression of FAK and kinase-defective FAK in MDA-MB-231 cells resulted in a slight increase in motility without affecting the inhibitory effect of SPP, whereas expression of FAK with a mutation of the major autophosphorylation site (F397) abolished the inhibitory effect of SPP on cell motility. In contrast, the phosphoinositide 3'-kinase inhibitor, wortmannin, inhibited chemotactic motility in both vector- and FAK-F397-transfected cells. Our results suggest that autophosphorylation of FAK on Y397 may play an important role in SPP signaling leading to decreased cell motility.

Relevance: 100.00%

Abstract:

Cleavage and polyadenylation factor (CPF) is a multi‐protein complex that functions in pre‐mRNA 3′‐end formation and in the RNA polymerase II (RNAP II) transcription cycle. Ydh1p/Cft2p is an essential component of CPF, but its precise role in 3′‐end processing remained unclear. We found that mutations in YDH1 inhibited both the cleavage and the polyadenylation steps of the 3′‐end formation reaction in vitro. Recently, we demonstrated that an important function of CPF lies in the recognition of poly(A) site sequences, and RNA binding analyses suggest that Ydh1p/Cft2p interacts with the poly(A) site region. Here we show that mutant ydh1 strains are deficient in the recognition of the ACT1 cleavage site in vivo. The C‐terminal domain (CTD) of RNAP II plays a major role in coupling 3′‐end processing and transcription. We provide evidence that Ydh1p/Cft2p interacts with the CTD of RNAP II, several other subunits of CPF, and Pcf11p, a component of CF IA. We propose that Ydh1p/Cft2p contributes to the formation of important interaction surfaces that mediate the dynamic association of CPF with RNAP II, the recognition of poly(A) site sequences and the assembly of the polyadenylation machinery on the RNA substrate.

Relevance: 100.00%

Abstract:

BACKGROUND: Adherence to medicines is important in subjects with diabetes, as nonadherence is associated with an increased risk of morbidity and mortality. However, it is not clear whether there is an association between adherence to medicines and glycaemic control, as not all studies have shown this. One of the reasons for this discrepancy may be that, although there is a standard measure of glycaemic control, i.e. HbA1c, there is no standard measure of adherence to medicines. Adherence to medicines can be measured either qualitatively, by Morisky or non-Morisky methods, or quantitatively, using the medicines possession ratio (MPR). AIMS OF THE REVIEW: The aims of this literature review are (1) to determine whether there is an association between adherence to anti-diabetes medicines and glycaemic control, and (2) to determine whether any such association depends on how adherence is measured. METHODS: A literature search of Medline, CINAHL and the Internet (Google) was undertaken with the search terms 'diabetes' with 'adherence' (or compliance, concordance, persistence, continuation) with 'HbA1c' (or glycaemic control). RESULTS: Twenty-three studies were included: 10 qualitative studies, 12 quantitative studies, and one study using both methods. For the qualitative measurements of adherence to anti-diabetes medicines (non-Morisky and Morisky), eight out of ten studies show an association with HbA1c. Nine of ten studies using the quantitative MPR, and two studies using MPR for insulin only, have also shown an association between adherence to anti-diabetes medicines and HbA1c. However, the one study that used both Morisky and MPR did not show an association. Three of the four studies that did not show a relationship did not use a range of HbA1c values in their regression analysis. The other study that did not show a relationship was conducted specifically in a low-income population. CONCLUSIONS: Most studies show an association between adherence to anti-diabetes medicines and HbA1c levels, and this seems to be independent of the method used to measure adherence. However, to show an association it is necessary to have a range of HbA1c values. Also, the association is not always apparent in low-income populations.
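
The medicines possession ratio mentioned above is conventionally computed as the total days' supply dispensed divided by the number of days in the observation period, often capped at 1. A minimal sketch under that conventional definition follows; the refill values are illustrative only.

    # Sketch: medicines possession ratio (MPR) under its conventional definition
    # (total days' supply dispensed / days in the observation period, capped at 1).
    # The refill values are illustrative, not from any study in the review.
    def medicines_possession_ratio(days_supplied, period_days, cap=True):
        mpr = sum(days_supplied) / period_days
        return min(mpr, 1.0) if cap else mpr

    # Four 30-day refills collected over a 180-day window -> MPR = 0.67
    print(round(medicines_possession_ratio([30, 30, 30, 30], 180), 2))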

Relevance: 100.00%

Abstract:

The echolocation calls of long-tailed bats (Chalinolobus tuberculatus) were recorded in the Eglinton Valley, Fiordland, New Zealand, and digitized for analysis with signal-processing software. Univariate and multivariate analyses of measured features facilitated a quantitative classification of the calls. Cluster analysis was used to categorize the calls into two groups equating to the search and terminal buzz calls described qualitatively for other species. When moving from the search to the terminal phase, the calls decrease in bandwidth, maximum and minimum frequency, and duration. Search calls begin with a steep downward FM sweep followed by a short, less-modulated component. Buzz calls are FM sweeps. Although not found quantitatively, a broad pre-buzz group of calls was also identified. Ambiguity analysis of calls from the three groups shows that search-phase calls are well suited to resolving the velocity of targets and hence to identifying moving targets in stationary clutter. Pre-buzz and buzz calls are better suited to resolving range, a feature that may aid the bats in the capture of evasive prey after it has been identified.
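
The abstract reports a cluster analysis over measured call features but does not name the algorithm; the sketch below uses k-means purely as an illustration of grouping calls into search-like and buzz-like clusters, with hypothetical feature values and an assumed scikit-learn dependency.

    # Sketch: clustering measured call features into two groups (search vs terminal buzz).
    # The clustering algorithm used in the study is not named, so k-means is used here
    # only as an illustration; the feature values are hypothetical.
    import numpy as np
    from sklearn.cluster import KMeans

    # columns: bandwidth (kHz), max frequency (kHz), min frequency (kHz), duration (ms)
    calls = np.array([
        [25.0, 55.0, 30.0, 8.0],   # search-phase-like calls: broadband, longer
        [24.0, 54.0, 30.0, 7.5],
        [26.0, 56.0, 30.0, 8.5],
        [12.0, 40.0, 28.0, 2.0],   # buzz-like calls: narrower, lower, shorter
        [11.0, 39.0, 28.0, 1.8],
        [13.0, 41.0, 28.0, 2.2],
    ])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(calls)
    print(labels)  # two groups separating the long broadband calls from the short buzzes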