540 results for Random matrix theory
Abstract:
Analytical expressions are derived for the mean and variance of estimates of the bispectrum of a real time series, assuming a cosinusoidal model. The effects of spectral leakage, inherent in the discrete Fourier transform operation when the modes present in the signal have a nonintegral number of wavelengths in the record, are included in the analysis. A single phase-coupled triad of modes can cause the bispectrum to have a nonzero mean value over the entire region of computation owing to leakage. The variance of bispectral estimates in the presence of leakage has contributions from individual modes and from triads of phase-coupled modes. Time-domain windowing reduces the leakage. The theoretical expressions for the mean and variance of bispectral estimates are derived in terms of a function dependent on an arbitrary symmetric time-domain window applied to the record, the number of data points, and the statistics of the phase coupling among triads of modes. The theoretical results are verified by numerical simulations for simple test cases and applied to laboratory data to examine phase coupling in a hypothesis-testing framework.
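To make the quantities concrete, the following Python sketch computes a segment-averaged, windowed bispectrum estimate of a real time series via the DFT triple product X(k)X(l)X*(k+l). The Hann window, the segmentation scheme and the normalisation are illustrative assumptions, not the estimator analysed in the paper.

```python
import numpy as np

# A minimal sketch of a windowed bispectrum estimate; the Hann window,
# segmentation and normalisation are assumptions for illustration only.
def bispectrum_estimate(x, nfft):
    w = np.hanning(nfft)                      # time-domain windowing reduces leakage
    n_segments = len(x) // nfft
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for s in range(n_segments):
        seg = x[s * nfft:(s + 1) * nfft] * w
        X = np.fft.fft(seg)
        for k in range(nfft // 2):
            for l in range(nfft // 2 - k):
                # Accumulate the triple product X(k) X(l) X*(k+l) over segments.
                B[k, l] += X[k] * X[l] * np.conj(X[k + l])
    return B / max(n_segments, 1)             # segment-averaged bispectral estimate
```

Averaging over independent segments reduces the variance of the estimate, while the choice of window trades leakage against frequency resolution, in line with the trade-offs discussed in the abstract.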
Abstract:
The CDKN2 gene, encoding the cyclin-dependent kinase inhibitor p16, is a tumour suppressor gene that maps to chromosome band 9p21-p22. The most common mechanism of inactivation of this gene in human cancers is through homozygous deletion; however, in a smaller proportion of tumours and tumour cell lines intragenic mutations occur. In this study we have compiled a database of over 120 published point mutations in the CDKN2 gene from a wide variety of tumour types. A further 50 deletions, insertions, and splice mutations in CDKN2 have also been compiled. Furthermore, we have standardised the numbering of all mutations according to the full-length 156 amino acid form of p16. From this study we are able to define several hot spots, some of which occur at conserved residues within the ankyrin domains of p16. While many of the hotspots are shared by a number of cancers, the relative importance of each position varies, possibly reflecting the role of different carcinogens in the development of certain tumours. As reported previously, the mutational spectrum of CDKN2 in melanomas differs from that of internal malignancies and supports the involvement of UV in melanoma tumorigenesis. Notably, 52% of all substitutions in melanoma-derived samples occurred at just six nucleotide positions. Nonsense mutations comprise a comparatively high proportion of mutations present in the CDKN2 gene, and possible explanations for this are discussed.
Abstract:
Levels of waste within the construction industry need to be reduced for environmental and economic reasons. Changing people's wasteful behaviour can make a significant contribution. This paper describes a research project that used Ajzen's 'theory of planned behaviour' to investigate the attitudinal forces that shape behaviour at the operative level. It concludes that operatives see waste as an inevitable by-product of construction activity. Attitudes towards waste management are not negative, although they are pragmatic and impeded by perceptions of a lack of managerial commitment. Waste management is perceived as a low project priority, and there is an absence of appropriate resources and incentives to support it. A theory of waste behaviour is proposed for the construction industry, and recommendations are made to help managers improve operatives' attitudes towards waste.
Abstract:
This paper offers a reply to Jochen Runde's critical appraisal of the ontological framework underpinning Dopfer and Potts's (2008) General Theory of Economic Evolution. We argue that Runde's comprehensive critique contains what we perceive to be several misunderstandings of the key concepts of ‘generic’ and ‘meso’, which we seek here to unpack and redress.
Abstract:
The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, namely that no propensity function changes “significantly” during any time-step, is met. Using this method, there is a possibility that species numbers can artificially become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative. At most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, produce a high probability of driving a reactant population negative are labeled critical. The number of firings of more reaction channels can then be approximated using Poisson random variables, thus speeding up the simulation while maintaining accuracy. In implementing this revised method of criticality selection, we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
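The probability-based criticality test described in this abstract lends itself to a short illustration. The sketch below shows one explicit τ-leap step in which a reaction channel is labelled critical when a Poisson firing count has more than a threshold probability of exhausting its reactant populations. The stoichiometry layout, the threshold `p_crit` and the handling of critical reactions are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy.stats import poisson

# A minimal sketch of one explicit tau-leap step with a probability-based
# criticality test. The stoichiometry matrix V (species x reactions), the
# propensity callable, the threshold p_crit and the treatment of critical
# reactions are illustrative assumptions, not the authors' exact method.
def tau_leap_step(x, V, propensities, tau, p_crit=0.01, rng=None):
    rng = rng or np.random.default_rng()
    a = propensities(x)                      # propensities at current state x
    means = a * tau                          # expected firings over the leap
    n_reactions = V.shape[1]

    critical = np.zeros(n_reactions, dtype=bool)
    for j in range(n_reactions):
        consumers = V[:, j] < 0              # species consumed by reaction j
        if a[j] == 0.0 or not consumers.any():
            continue
        # Largest number of firings the current reactant populations allow.
        max_firings = int(np.min(x[consumers] // (-V[consumers, j])))
        # Label critical if a Poisson(a_j * tau) count is likely to exceed it,
        # i.e. likely to drive some population negative.
        critical[j] = poisson.sf(max_firings, means[j]) > p_crit

    k = np.zeros(n_reactions, dtype=int)
    k[~critical] = rng.poisson(means[~critical])   # leap the non-critical channels
    a_crit = np.where(critical, a, 0.0)
    if a_crit.sum() > 0.0 and rng.exponential(1.0 / a_crit.sum()) <= tau:
        # At most one critical reaction fires within the leap, chosen in
        # proportion to its propensity (a simplification for illustration).
        k[rng.choice(n_reactions, p=a_crit / a_crit.sum())] = 1
    return x + V @ k                         # updated species counts
```

Reactions whose firing counts are safely bounded by their reactant populations are leapt with Poisson draws, while only the few channels likely to go negative receive the more careful single-firing treatment, which is the efficiency gain the abstract describes.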
Abstract:
The study of venture idea characteristics and the contextual fit between venture ideas and individuals are key research goals in entrepreneurship (Davidsson, 2004). However, to date there has been limited scholarly attention given to these phenomena. Accordingly, this study aims to help fill the gap by investigating the importance of novelty and relatedness of venture ideas in entrepreneurial firms. On the premise that new venture creation is a process and that research should be focused on the early stages of the venturing process, this study primarily examines how venture idea novelty and relatedness affect performance in the venture creation process. Different types and degrees of novelty are considered here. Relatedness is shown to be based on individuals’ prior knowledge and resource endowment. Performance in the venture creation process is evaluated according to four possible outcomes: making progress, getting operational, being terminated and achieving positive cash flow. A theoretical model is developed demonstrating the relationship between these variables along with the investment of time and money. Several testable hypotheses are developed. Among them, it is hypothesised that novelty hinders short-term performance in the venture creation process. On the other hand, knowledge and resource relatedness are hypothesised to promote performance. An experimental study was required in order to understand how different types and degrees of novelty and relatedness of venture ideas affect their attractiveness in the eyes of experienced entrepreneurs. Thus, the empirical work in this thesis was based on two separate studies. In the first, a conjoint analysis experiment was conducted with 32 experienced entrepreneurs in order to ascertain attitudinal preferences regarding venture idea attractiveness based on novelty, relatedness and potential financial gains. This helped to estimate utility values for different levels of different attributes of venture ideas and their relative importance to attractiveness. The second study was a longitudinal investigation of how venture idea novelty and relatedness affect performance in the venture creation process. The data for this study come from the Comprehensive Australian Study for Entrepreneurial Emergence (CAUSEE) project, which was established to explore the new venture creation process in Australia. CAUSEE collects data from a representative sample of over 30,000 households in Australia using random digit dialling (RDD) telephone interviews. From these cases, data were collected at two points in time during a 12-month period from 493 firms involved in the start-up process. Hypotheses were tested and inferences were derived through descriptive statistics, confirmatory factor analysis and structural equation modelling. Results of study 1 indicate that venture idea characteristics play a role in attractiveness and that entrepreneurs prefer to introduce a moderate degree of novelty across all types of venture ideas concerned. Knowledge relatedness is demonstrated to be a more significant factor in attractiveness than resource relatedness. Results of study 2 show that novelty hinders nascent venture performance. On the other hand, resource relatedness has a positive impact on performance, unlike knowledge relatedness, which has none. The results of these studies have important implications for potential entrepreneurs, investors, researchers and consultants, by developing a better understanding of the venture creation process and its success factors in terms of both theory and practice.
Abstract:
Web service technology is increasingly being used to build various e-Applications, in domains such as e-Business and e-Science. Characteristic benefits of web service technology are its inter-operability, decoupling and just-in-time integration. Using web service technology, an e-Application can be implemented by web service composition, that is, by composing existing individual web services in accordance with the business process of the application. This means the application is provided to customers in the form of a value-added composite web service. An important and challenging issue of web service composition is how to meet Quality-of-Service (QoS) requirements. This includes customer-focused elements such as response time, price, throughput and reliability, as well as how to best provide QoS results for the composites. This in turn best fulfils customers’ expectations and achieves their satisfaction. Fulfilling these QoS requirements, or addressing the QoS-aware web service composition problem, is the focus of this project. From a computational point of view, QoS-aware web service composition can be transformed into diverse optimisation problems. These problems are characterised as complex, large-scale, highly constrained and multi-objective problems. We therefore use genetic algorithms (GAs) to address QoS-based service composition problems. More precisely, this study addresses three important subproblems of QoS-aware web service composition: QoS-based web service selection for a composite web service accommodating constraints on inter-service dependence and conflict, QoS-based resource allocation and scheduling for multiple composite services on hybrid clouds, and performance-driven composite service partitioning for decentralised execution. Based on operations research theory, we model the three problems as a constrained optimisation problem, a resource allocation and scheduling problem, and a graph partitioning problem, respectively. Then, we present novel GAs to address these problems. We also conduct experiments to evaluate the performance of the new GAs. Finally, verification experiments are performed to show the correctness of the GAs. The major outcomes from the first problem are three novel GAs: a penalty-based GA, a min-conflict hill-climbing repairing GA, and a hybrid GA. These GAs adopt different constraint handling strategies to handle constraints on inter-service dependence and conflict, an important factor that has been largely ignored by existing algorithms and whose omission might lead to the generation of infeasible composite services. Experimental results demonstrate the effectiveness of our GAs for handling the QoS-based web service selection problem with constraints on inter-service dependence and conflict, as well as their better scalability than the existing integer programming-based method for large-scale web service selection problems. The major outcomes from the second problem are two GAs: a random-key GA and a cooperative coevolutionary GA (CCGA). Experiments demonstrate the good scalability of the two algorithms. In particular, the CCGA scales well as the number of composite services involved in a problem increases, while no other algorithm demonstrates this ability. The findings from the third problem result in a novel GA for composite service partitioning for decentralised execution. Compared with existing heuristic algorithms, the new GA is more suitable for large-scale composite web service program partitioning problems. In addition, the GA outperforms existing heuristic algorithms, generating a better deployment topology for a composite web service for decentralised execution. These effective and scalable GAs can be integrated into QoS-based management tools to facilitate the delivery of feasible, reliable and high-quality composite web services.
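As an illustration of the penalty-based constraint handling named in this abstract, the Python sketch below evaluates a candidate composite service (one concrete service chosen per abstract task) by aggregating a single QoS attribute and penalising violations of inter-service dependence and conflict constraints. The encoding, the cost-only QoS aggregation and the penalty weight are illustrative assumptions, not the thesis's actual algorithm.

```python
# A minimal sketch of a penalty-based fitness for QoS-driven service selection.
# chromosome[i] is the index of the concrete service chosen for abstract task i;
# the single-attribute QoS model and the penalty weight are assumptions.
def penalised_fitness(chromosome, qos_cost, dependences, conflicts, penalty=1000.0):
    # Aggregate QoS: here simply the summed cost of the selected services.
    cost = sum(qos_cost[task][service] for task, service in enumerate(chromosome))

    violations = 0
    for (task_a, svc_a), (task_b, svc_b) in dependences:
        # Dependence: choosing svc_a for task_a requires svc_b for task_b.
        if chromosome[task_a] == svc_a and chromosome[task_b] != svc_b:
            violations += 1
    for (task_a, svc_a), (task_b, svc_b) in conflicts:
        # Conflict: svc_a for task_a must not be combined with svc_b for task_b.
        if chromosome[task_a] == svc_a and chromosome[task_b] == svc_b:
            violations += 1

    return cost + penalty * violations      # lower fitness is better
```

A standard GA loop (selection, crossover, mutation) would then minimise this value; the repair-based and hybrid variants mentioned in the abstract would instead modify infeasible chromosomes rather than penalise them.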
Abstract:
In this paper, a new practical method based on graph theory and an improved genetic algorithm is employed to solve the optimal sectionalizer switch placement problem. The proposed method determines the best locations of sectionalizer switching devices in distribution networks, considering the effects of the presence of distributed generation (DG) in the fitness functions and other optimization constraints, so that the maximum number of customers can be supplied by distributed generation sources in islanded distribution systems after possible faults. The proposed method is simulated and tested on several distribution test systems, both with and without DG. The results of the simulations validate the proposed method for switch placement in distribution networks in the presence of distributed generation.
Abstract:
Business Process Management (BPM) is a topic that continues to grow in significance as organisations seek to gain and sustain competitive advantage in an increasingly global environment. Despite anecdotal evidence of organisations improving performance by pursuing a BPM approach, there is little theory that explains and substantiates this relationship. This study provides the first theory on the progression and maturity of BPM Initiatives within organisations and provides a vital starting block upon which future research in this area can build. The Researcher starts by clearly defining three key terms (BPM Initiative, BPM Progression and BPM Maturity), showing the relationship between these three concepts and proposing their relationship with Organisational Performance. The Researcher then combines extant literature and use of the Delphi Technique and the case study method to explore the progression and measurement of the BPM Initiatives within organisations. The study builds upon the principles of general theories including the Punctuated Equilibrium Model and Dynamic Capabilities to present theory on BPM Progression and BPM Maturity. Using the BPM Capability Framework developed through an international Delphi study series, the Researcher shows how the specific organisational context influences which capability areas an organisation chooses to progress. By comparing five separate organisations over an extended time the Researcher is able to show that, despite this disparity, there is some evidence of consistency with regard to the capability areas progressed. This suggests that subsequent identification of progression paths may be possible. The study also shows that the approach and scope taken to BPM within each organisation is a likely predictor of such paths. These outcomes result in the proposal of a formative model for measuring BPM Maturity.
Abstract:
A crucial contemporary policy question for governments across the globe is how to cope with international crime and terrorist networks. Many such “dark” networks—that is, networks that operate covertly and illegally—display a remarkable level of resilience when faced with shocks and attacks. Based on an in-depth study of three cases (MK, the armed wing of the African National Congress in South Africa during apartheid; FARC, the Marxist guerrilla movement in Colombia; and the Liberation Tigers of Tamil Eelam, LTTE, in Sri Lanka), we present a set of propositions to outline how shocks impact dark network characteristics (resources and legitimacy) and networked capabilities (replacing actors, linkages, balancing integration and differentiation) and how these in turn affect a dark network's resilience over time. We discuss the implications of our findings for policymakers.
Abstract:
There is a lack of writing on the issue of the education rights of people with disabilities by authors of any theoretical persuasion. While the deficiency of theory may be explained by a variety of historical, philosophical and practical considerations, it is a deficiency which must be addressed. Otherwise, any statement of rights rings out as hollow rhetoric unsupported by sound reason and moral rectitude. This paper attempts to address this deficiency in education rights theory by postulating a communitarian theory of the education rights of people with disabilities. The theory is developed from communitarian writings on the role of education in democratic society. The communitarian school, like the community within which it nests, is inclusive. Schools both reflect and model the shape of communitarian society and have primary responsibility for teaching the knowledge and virtues which will allow citizens to belong to and function within society. Communitarians emphasise responsibilities, however, as the corollary of rights and require the individual good to yield to community good when the hard cases arise. The article not only explains the basis of the right to an inclusive education, therefore, but also engages with the difficult issue of when such a right may not be enforceable.
Abstract:
Queensland's new State Planning Policy for Coastal Protection, released in March and approved in April 2011 as part of the Queensland Coastal Plan, stipulates that local governments prepare and implement adaptation strategies for built-up areas projected to be subject to coastal hazards between the present day and 2100. Urban localities within the delineated coastal high hazard zone (as determined by models incorporating a 0.8 metre rise in sea level and a 10% increase in maximum cyclone activity) will be required to re-evaluate their plans to accommodate growth, revising land use plans to minimise impacts of anticipated erosion and flooding on developed areas and infrastructure. While implementation of such strategies would aid in avoidance or minimisation of risk exposure, communities are likely to face significant challenges in such implementation, especially as development in Queensland is so intensely focussed upon its coasts, with these new policies directing development away from highly desirable waterfront land. This paper examines models of planning theory to understand how we plan when faced with technically complex problems, working towards the formulation of a framework for evaluating and improving practice.
Abstract:
The concept of knowledge-based urban development first came to the urban planning and development agenda during the last years of the 20th century as a promising paradigm to support the transformation of cities into knowledge cities and their societies into knowledge societies. However, the exponentially rapid advancements experienced during the first decade of the 21st century, particularly in the domains of economy, society, management and technology, along with the severe impacts of climate change, soon made a redefinition of the term a necessity. This paper first reports the findings of a review of the relatively short but dynamic history of urbanisation experiences of cities around the globe. The paper then focuses on the 21st century urbanisation context and discusses the conceptual base of the knowledge-based development of cities and how this concept has found application in many parts of the world. Following this, the paper speculates on the development of future cities, highlighting in particular potential challenges and opportunities that have not previously been fully considered. Lastly, the paper introduces and elaborates how relevant theories support a better conceptualisation of this relatively new but rapidly emerging paradigm, and redefines it accordingly.
Abstract:
Using a thematic storytelling approach that draws on ethnographic methods, a grounded theory of protest movement continuity is presented. The grounded theory draws from theories and activist stories relating to the facilitative role of movement networks, social contagion theory and the cultural experience of activism. It highlights the contagious influence of protest networks in maintaining protest continuity over time and how this leads to common perceptions of development risk and opportunity within communities. It also reveals how communities use collective values and identity, social capital, emotional dynamics and symbolic artifacts to maintain protest continuity.
Abstract:
This article augments Resource Dependence Theory with Real Options reasoning in order to explain time bounds specification in strategic alliances. Whereas prior work has found about a 50/50 split between alliances that are time-bound and those that are open-ended, their substantive differences and antecedents are ill understood. To address this, we suggest that the two alliance modes present different real options trade-offs in adaptation to environmental uncertainty: ceteris paribus, time-bound alliances are likely to provide abandonment options over open-ended alliances, but require additional investments to extend the alliance when this turns out to be desirable after formation. Open-ended alliances are likely to provide growth options over time-bound alliances, but they demand additional effort to abandon the alliance if post-formation circumstances so require. Therefore, we expect time bounds specification to be a function of environmental uncertainty: organizations in more uncertain environments will be relatively more likely to place time bounds on their strategic alliances. Longitudinal archival and survey data collected amongst 39 industry clusters provide empirical support for our claims, which contribute to the recent renaissance of resource dependence theory by specifying the conditions under which organizations choose different time windows in strategic partnering.