952 results for Set theory


Relevance:

20.00%

Publisher:

Abstract:

The main aim of this paper is to describe an adaptive replanning algorithm, based on an RRT and game theory, that produces an efficient, collision-free, obstacle-adaptive Mission Path Planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path replanning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision aid tool and UAV autopilots.
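The RRT component can be illustrated with a minimal 2-D sketch, not the authors' implementation: circles stand in for static obstacles and NFZs, and a tree is grown from the start until a collision-free connection to the goal is found. All function names and parameters here are illustrative assumptions.

```python
import math
import random

def collides(p, q, obstacles, steps=20):
    """Sample the segment p->q and test against circular no-fly zones (cx, cy, r)."""
    for i in range(steps + 1):
        t = i / steps
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        for cx, cy, r in obstacles:
            if math.hypot(x - cx, y - cy) <= r:
                return True
    return False

def rrt(start, goal, obstacles, bounds, step=0.5, iters=5000, seed=1):
    """Grow a rapidly-exploring random tree from start toward goal."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # Bias sampling toward the goal occasionally.
        sample = goal if rng.random() < 0.1 else (
            rng.uniform(*bounds[0]), rng.uniform(*bounds[1]))
        # Find the nearest tree node, then steer one step toward the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if collides(near, new, obstacles):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < step and not collides(new, goal, obstacles):
            # Walk parent links back to the root to recover the path.
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None  # no collision-free path found within the iteration budget
```

For example, `rrt((0, 0), (10, 10), [(5, 5, 2)], ((0, 10), (0, 10)))` routes around a circular zone that blocks the straight-line path. Replanning in the paper's sense would amount to re-running the search from the current position when a new obstacle or NFZ appears.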

Relevance:

20.00%

Publisher:

Abstract:

This paper is based on a PhD thesis that investigated how Hollywood’s dominance of the movie industry arose and how it has been maintained over time. Major studio dominance and the global popularity of Hollywood movies have been the subject of numerous studies. An interdisciplinary literature review of the economics, management, marketing, film, media and culture literatures identified twenty different single- or multiple-factor explanations that try to account for Major studio dominance at different time periods but cannot comprehensively explain how Hollywood acquired and maintained global dominance for nine decades. Existing strategic management and marketing theories were integrated into a ‘theoretical lens’ that enabled a historical analysis of Hollywood’s longstanding dominance of the movie business to be undertaken from a strategic business perspective. This paper concludes that the major studios’ rise to market leadership and enduring dominance can primarily be explained because they developed and maintained a set of strategic marketing management capabilities that were superior to those of rival firms and rival film industries. It is argued that a marketing orientation and effective strategic marketing management capabilities also provide a unifying theory for Hollywood’s enduring dominance because they can account for each of the twenty previously identified explanations for that dominance.



Relevance:

20.00%

Publisher:

Abstract:

The book examines the correlation between Intellectual Property Law – notably copyright – on the one hand and social and economic development on the other. The main focus of the initial overview is on historical, legal, economic and cultural aspects. Building on that, the work subsequently investigates how intellectual property systems have to be designed in order to foster social and economic growth in developing countries and puts forward theoretical and practical solutions that should be considered and implemented by policy makers, legal experts and the World Intellectual Property Organization (WIPO).

Relevance:

20.00%

Publisher:

Abstract:

Student engagement is a key contributor to student achievement and retention. Increasingly, international and Australasian universities are introducing a range of specific initiatives aimed at monitoring and intervening with students who are at risk of disengaging, particularly in their first year of study. A multi-site case study formed the focus of a national learning and teaching project to develop a suite of resources to guide good practice for safeguarding student learning engagement that were consistent with the notions of equity and social justice. Pivotal to the suite of resources is the Social Justice Framework and a set of social justice principles that emerged through a synthesis of existing literature and were further refined through the examination of qualitative data collected across the participating institutions. These social justice principles reflect general notions of equity and social justice, embrace the philosophical position of recognitive social justice, and are presented in an interconnected and co-dependent way within the framework. Participants will be provided with the opportunity to identify and discuss the practical applications of the principles to student engagement activities in their own institutions.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to determine factors (internal and external) that influenced Canadian provincial (state) politicians when making funding decisions about public libraries. Using the case study methodology, Canadian provincial/state level funding for public libraries in the 2009-10 fiscal year was examined. After reviewing funding levels across the country, three jurisdictions were chosen for the case: British Columbia's budget revealed dramatically decreased funding, Alberta's budget showed dramatically increased funding, and Ontario's budget was unchanged from the previous year. The primary source of data for the case was a series of semi-structured interviews with elected officials and senior bureaucrats from the three jurisdictions. An examination of primary and secondary documents was also undertaken to help set the political and economic context as well as to provide triangulation for the case interviews. The data were analysed to determine whether Cialdini's theory of influence (2001) and specifically any of the six tactics of influence (i.e., commitment and consistency, authority, liking, social proof, scarcity and reciprocity) were instrumental in these budget processes. Findings show the principles of "authority", "consistency and commitment" and "liking" were relevant, and that "liking" was especially important to these decisions. When these decision makers were considering funding for public libraries, they most often used three distinct lenses: the consistency lens (what are my values? what would my party do?), the authority lens (is someone with hierarchical power telling me to do this? are the requests legitimate?), and most importantly, the liking lens (how much do I like and know about the requester?). These findings are consistent with Cialdini's theory, which suggests the quality of some relationships is one of six factors that can most influence a decision maker.
The small number of prior research studies exploring the reasons for increases or decreases in public library funding allocation decisions has given little insight into the factors that motivate those politicians involved in the process and the variables that contribute to these decisions. No prior studies have examined the construct of influence in decision making about funding for Canadian public libraries at any level of government. Additionally, no prior studies have examined the construct of influence in decision making within the context of Canadian provincial politics. While many public libraries are facing difficult decisions in the face of uncertain funding futures, the ability of the sector to obtain favourable responses to requests for increases may require a less simplistic approach than previously thought. The ability to create meaningful connections with individuals in many communities and across all levels of government should be emphasised as a key factor in influencing funding decisions.

Relevance:

20.00%

Publisher:

Abstract:

The Theory of the Growth of the Firm by Edith Penrose, first published in 1959, is a seminal contribution to the field of management. Penrose's intention was to create a theory of firm growth which was logically consistent and empirically tractable (Buckley and Casson, 2007). Much attention, however, has been focused on her unintended contribution to the resource-based view (henceforth RBV) (e.g. Kor and Mahoney, 2004; Lockett and Thompson, 2004) rather than her firm growth theory. We feel that this is unfortunate because despite a rapidly growing body of empirical work, conceptual advancement in growth studies has been limited (Davidsson and Wiklund, 2000; Davidsson et al., 2006; Delmar, 1997; Storey, 1994). The growth literature frequently references Penrose's work, but little explicit testing of her ideas has been undertaken. This is surprising given that Penrose's work remains the most comprehensive theory of growth to date. One explanation is that she did not formally present her arguments, favouring verbal exposition over formalized models (Lockett, 2005; Lockett and Thompson, 2004). However, the central propositions and conclusions of her theory can be operationalized and empirically tested.

Relevance:

20.00%

Publisher:

Abstract:

A value-shift began to influence global political thinking in the late 20th century, characterised by recognition of the need for environmentally, socially and culturally sustainable resource development. This shift entailed a move away from thinking of ‘nature’ and ‘culture’ as separate entities – the former existing to serve the latter – toward the possibility of embracing the intrinsic worth of the nonhuman world. Cultural landscape theory recognises ‘nature’ as at once both ‘natural’, and a ‘cultural’ construct. As such, it may offer a framework through which to progress in the quest for ‘sustainable development’. This study makes a contribution to this quest by asking whether contemporary developments in cultural landscape theory can contribute to rehabilitation strategies for Australian open-cut coal mining landscapes. The answer is ‘yes’. To answer the research question, a flexible, ‘emergent’ methodological approach has been used, resulting in the following outcomes. A thematic historical overview of landscape values and resource development in Australia post-1788, and a review of cultural landscape theory literature, contribute to the formation of a new theoretical framework: Reconnecting the Interrupted Landscape. This framework establishes a positive answer to the research question. It also suggests a method of application within the Australian open-cut coal mining landscape, a highly visible exemplar of the resource development landscape. This method is speculatively tested against the rehabilitation strategy of an operating open-cut coal mine, concluding with positive recommendations to the industry, and to government.

Relevance:

20.00%

Publisher:

Abstract:

Research into hyperinsulinemic laminitis has progressed significantly in recent years with the use of the prolonged-euglycemic, hyperinsulinemic clamp (p-EHC). Previous investigations of laminitis pathophysiology have focused on digital vascular dysfunction, inflammation, altered glucose metabolism within the lamellae, and lamellar basement membrane breakdown by metalloproteinases. The etiopathogenesis of laminitis occurring in association with hyperinsulinemia is yet to be fully characterized, but it may not involve these mechanisms. Insulin stimulates cellular proliferation and can also affect other body systems, such as the insulin-like growth factor (IGF) system. Insulin-like growth factor-1 (IGF-1) is structurally homologous to insulin and, like insulin, binds with strong affinity to a specific tyrosine kinase receptor on the cell surface to produce its effects, which include promoting cell proliferation. Receptors for IGF-1 (IGF-1R) are present in the lamellar epidermis. An alternative theory for the pathogenesis of hyperinsulinemic laminitis is that uncontrolled cell proliferation, mediated through both the insulin receptor (InsR) and IGF-1R, leads to lengthening, weakening, and failure of the lamellae. An analysis of the proliferative activity of lamellar epidermal cells during the developmental and acute phases of hyperinsulinemic laminitis, and lamellar gene expression of the InsR and IGF-1R was undertaken.

Relevance:

20.00%

Publisher:

Abstract:

Speaker diarization is the process of annotating an input audio with information that attributes temporal regions of the audio signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems effectively answer the question of ‘who spoke when’. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems to allow users to directly access the relevant segments of interest within a given audio, and assisting with other downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted from a speaker diarization system can provide complementary information for ASR transcripts, including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers to pool data for model adaptation, which in turn boosts transcription accuracies. Speaker diarization therefore plays an important role as a preliminary step in automatic transcription of audio data. The aim of this work is to improve the usefulness and practicality of speaker diarization technology, through the reduction of diarization error rates. In particular, this research is focused on the segmentation and clustering stages within a diarization system. Although particular emphasis is placed on the broadcast news audio domain and systems developed throughout this work are also trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains including telephone conversations and meetings audio.
Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling. The use of heuristic approaches for the speaker segmentation task was first investigated, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed, to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed with the aim of improving detection of boundaries around short speaker segments. Compared to single threshold based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate. Methods to model the uncertainty in speaker model estimates were developed, to address the difficulties associated with making segmentation and clustering decisions with limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information regarding the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was also extended to the eigenvoice speaker modelling framework for the speaker clustering task. Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
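The segmentation idea above — score a model-based distance between adjacent windows and pick candidate boundaries by heuristic peak selection rather than a single threshold — can be sketched on a toy 1-D feature stream. This is an illustrative sketch, not the dissertation's system: the Gaussian likelihood-ratio score and the peak-picking rule are assumptions standing in for the actual heuristics.

```python
import numpy as np

def glr_curve(feats, win):
    """Gaussian log-likelihood-ratio distance between adjacent windows:
    how much better two single-Gaussian models fit than one."""
    n = len(feats)
    d = np.zeros(n)
    for t in range(win, n - win):
        left, right = feats[t - win:t], feats[t:t + win]
        both = feats[t - win:t + win]
        d[t] = (len(both) * np.log(both.std() + 1e-9)
                - len(left) * np.log(left.std() + 1e-9)
                - len(right) * np.log(right.std() + 1e-9))
    return d

def detect_boundaries(feats, win=50, min_gap=25):
    """Heuristic boundary selection: keep local maxima of the distance
    curve above an adaptive threshold, strongest-first, spaced >= min_gap."""
    d = glr_curve(feats, win)
    thresh = d.mean() + d.std()
    cands = [t for t in range(1, len(d) - 1)
             if d[t] > thresh and d[t] >= d[t - 1] and d[t] >= d[t + 1]]
    kept = []
    for t in sorted(cands, key=lambda t: -d[t]):
        if all(abs(t - k) >= min_gap for k in kept):
            kept.append(t)
    return sorted(kept)
```

A second pass in the spirit of the text would re-run `detect_boundaries` with a smaller `win` around short segments; on a synthetic stream that switches distribution at frame 300, the strongest detected peak falls near that change point.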

Relevance:

20.00%

Publisher:

Abstract:

We report a comprehensive theoretical study of the reaction of methane with the Fe4 cluster. This Letter gains insight into the mechanism of the reaction and indicates that the Fe4 cluster has a strong catalytic effect on the activation of methane. In detail, the results show the cleavage of the first C–H bond is both an energetically and kinetically favourable process, and the breaking of the second C–H bond is the rate-determining step. Moreover, our Letter demonstrates that the cluster size of iron can not only determine the catalytic activity toward methane but also control the product selectivity.

Relevance:

20.00%

Publisher:

Abstract:

The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources as, typically, few gene sequences can be simultaneously stored in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations. Therefore, minimizing the number of these operations will yield much faster run-times. This paper develops an approach for the faster and scalable computing of large-size correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over the existing approaches is demonstrated through benchmark examples.
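The core idea — computing the correlation matrix block by block so that only a small number of sequences are resident in memory at once, reducing I/O — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: `seqs` stands in for data that a real implementation would read from disk one block at a time, and `block` would be sized to the available memory.

```python
import numpy as np

def blocked_correlation(seqs, block=2):
    """Build an n x n correlation matrix block-by-block, so at most two
    blocks of sequences are held in memory at once (`seqs` is an in-memory
    stand-in for block-wise reads from disk)."""
    n = len(seqs)
    corr = np.eye(n)
    for i0 in range(0, n, block):
        rows = np.asarray(seqs[i0:i0 + block], dtype=float)
        for j0 in range(i0, n, block):
            cols = np.asarray(seqs[j0:j0 + block], dtype=float)
            # Correlate the stacked blocks, then slice out the cross part.
            c = np.corrcoef(np.vstack([rows, cols]))[:len(rows), len(rows):]
            corr[i0:i0 + len(rows), j0:j0 + len(cols)] = c
            corr[j0:j0 + len(cols), i0:i0 + len(rows)] = c.T
    return corr
```

Because correlation is pairwise, the blocked result is identical to computing `np.corrcoef` over the full data set in one pass; only the memory and I/O pattern changes, which is the point of the paper's approach.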

Relevance:

20.00%

Publisher:

Abstract:

Creative productivity emerges from human interactions (Hartley, 2009, p. 214). In an era when life is lived in rather than with media (Deuze, this issue), this productivity is widely distributed among ephemeral social networks mediated through the internet. Understanding the underlying dynamics of these networks of human interaction is an exciting and challenging task that requires us to come up with new ways of thinking and theorizing. For example, inducting theory from case studies that are designed to show the exceptional dynamics present within single settings can be augmented today by large-scale data generation and collections that provide new analytic opportunities to research the diversity and complexity of human interaction. Large-scale data generation and collection is occurring across a wide range of individuals and organisations. This offers a massive field of analysis which internet companies and research labs in particular are keen on exploring. Lazer et al. (2009: 721) argue that such analytic potential is transformational for many if not most research fields, but that the use of such valuable data must not remain confined to private companies and government agencies, nor to a privileged set of academic researchers whose studies can neither be replicated nor critiqued. In fact, the analytic capacity to have data of such unprecedented scope and scale available not only requires us to analyse what is and could be done with it and by whom (1) but also what it is doing to us, our cultures and societies (2). Part (1) of such analysis is interested in dependencies and their implications. Part (2) of the enquiry embeds part (1) in a larger context that analyses the long-term, complex dynamics of networked human interaction. From the latter perspective we can treat specific phenomena and the methods used to analyse them as moments of evolution.

Relevance:

20.00%

Publisher:

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To resolve this large memory problem, parallelization in OpenMP was used to optimally harness the shared-memory infrastructure on cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 Supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
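The percolation step can be illustrated with a minimal serial union-find labelling of a binary pore grid — the cluster-merging idea at the heart of the Hoshen-Kopelman algorithm — followed by a test for a cluster spanning the grid in z. This is an illustrative sketch only; it omits the parallelization, anisotropy, and local-porosity analyses described above, and all names are ours.

```python
import numpy as np

def find(parent, x):
    """Union-find root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def percolates(pores):
    """Label pore voxels of a 3-D boolean array by merging each voxel with
    its already-visited pore neighbours (Hoshen-Kopelman style), then test
    whether any cluster touches both the z=0 and z=nz-1 faces."""
    nz, ny, nx = pores.shape
    idx = lambda z, y, x: (z * ny + y) * nx + x
    parent = list(range(nz * ny * nx))

    def union(a, b):
        ra, rb = find(parent, a), find(parent, b)
        if ra != rb:
            parent[rb] = ra

    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if not pores[z, y, x]:
                    continue
                if z and pores[z - 1, y, x]:
                    union(idx(z, y, x), idx(z - 1, y, x))
                if y and pores[z, y - 1, x]:
                    union(idx(z, y, x), idx(z, y - 1, x))
                if x and pores[z, y, x - 1]:
                    union(idx(z, y, x), idx(z, y, x - 1))

    top = {find(parent, idx(0, y, x))
           for y in range(ny) for x in range(nx) if pores[0, y, x]}
    bottom = {find(parent, idx(nz - 1, y, x))
              for y in range(ny) for x in range(nx) if pores[nz - 1, y, x]}
    return bool(top & bottom)
```

A straight pore column through the sample percolates; disconnecting it anywhere along z breaks the spanning cluster and the test returns false.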

Relevance:

20.00%

Publisher:

Abstract:

The practical number of charge carriers loaded is crucial to the evaluation of the capacity performance of carbon-based electrodes in service, and cannot be easily addressed experimentally. In this paper, we report a density functional theory study of charge carrier adsorption onto zigzag edge-shaped graphene nanoribbons (ZGNRs), both pristine and incorporating edge substitution with boron, nitrogen or oxygen atoms. All edge substitutions are found to be energetically favorable, especially in oxidized environments. The maximal loading of protons onto the substituted ZGNR edges obeys a rule of [8-n-1], where n is the number of valence electrons of the edge-site atom constituting the adsorption site. Hence, a maximum charge loading is achieved with boron substitution. This result correlates in a transparent manner with the electronic structure characteristics of the edge atom. The boron edge atom, characterized by the most empty p band, facilitates more than the other substitutional cases the accommodation of valence electrons transferred from the ribbon, induced by adsorption of protons. This result not only further confirms the possibility of enhancing charge storage performance of carbon-based electrochemical devices through chemical functionalization but also, more importantly, provides the physical rationale for further design strategies.
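As a worked reading of the [8-n-1] rule stated above (the valence-electron counts are standard chemistry; the mapping itself is our illustration of the abstract's claim):

```python
# Maximal proton loading per edge site under the reported [8 - n - 1] rule,
# where n is the number of valence electrons of the edge-site atom.
valence = {"B": 3, "C": 4, "N": 5, "O": 6}
max_protons = {atom: 8 - n - 1 for atom, n in valence.items()}
# Boron, with the fewest valence electrons (emptiest p band), admits the
# highest loading, consistent with the maximum charge loading reported above.
best = max(max_protons, key=max_protons.get)
```

This reproduces the ordering in the abstract: B (4) > C (3) > N (2) > O (1).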

Relevance:

20.00%

Publisher:

Abstract:

Heteroatom doping on the edge of graphene may serve as an effective way to tune the chemical activity of carbon-based electrodes with respect to charge carrier transfer in an aqueous environment. In a step towards developing a mechanistic understanding of this phenomenon, we explore herein mechanisms of proton transfer from aqueous solution to pristine and doped graphene edges utilizing density functional theory. Atomic B-, N-, and O-doped edges as well as the native graphene are examined, displaying varying proton affinities and effective interaction ranges with the H3O+ charge carrier. Our study shows that the doped edges characterized by more dispersive orbitals, namely boron and nitrogen, demonstrate more energetically favourable charge carrier exchange compared with oxygen, which features more localized orbitals. Extended calculations are carried out to examine proton transfer from the hydronium ion in the presence of explicit water, with results indicating that the basic mechanistic features of the simpler model are unchanged.