895 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations


Relevance:

30.00%

Publisher:

Abstract:

Among the most disputed issues in the business arena and among academic scholars are the roles boards of directors are expected to fulfill and how they contribute to a company's success and survival (Monks & Minow, 2008). Recent failures of large corporations worldwide have led corporate governance and strategic management scholars to call for increased board involvement in decision-making (Tricker, 2009), paralleling regulators' requests for stronger monitoring and punishment in cases of fraud and misbehavior (Coffee, 2005).

Relevance:

30.00%

Publisher:

Abstract:

The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods have been found to be computationally intensive for design problems that require a large number of design points. A simulation-based approach that can be used to solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables is presented. The approach involves the use of lower-dimensional parameterisations that consist of a few design variables, which generate multiple design points. Using this approach, one simply has to search over a few design variables rather than over a large number of optimal design points, thus providing substantial computational savings. The methodologies are demonstrated on four applications, including the selection of sampling times for pharmacokinetic and heat-transfer studies, all of which involve nonlinear models. Several Bayesian design criteria are compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
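
To make the idea concrete, here is a minimal sketch (not the paper's implementation) of the parameterisation trick: ten sampling times are generated from just two design variables (a first time and a geometric ratio), and a Monte Carlo pseudo-Bayesian D-optimality criterion, under an assumed one-parameter exponential-decay model, is maximised over those two variables only.

    import numpy as np

    rng = np.random.default_rng(0)

    def sampling_times(t1, ratio, n=10):
        """Generate n design points (sampling times) from only 2 design variables."""
        return t1 * ratio ** np.arange(n)

    def utility(times, n_draws=200):
        """Monte Carlo estimate of expected log Fisher information over the prior."""
        total = 0.0
        for _ in range(n_draws):
            k = rng.lognormal(mean=np.log(0.5), sigma=0.3)  # assumed prior on decay rate
            s = -times * np.exp(-k * times)                 # sensitivity of y = exp(-k t) w.r.t. k
            total += np.log(np.sum(s * s))                  # log det of the 1x1 information matrix
        return total / n_draws

    # Search over the 2 design variables instead of the 10 design points.
    grid = [(t1, r) for t1 in np.linspace(0.1, 2.0, 20)
                    for r in np.linspace(1.1, 2.0, 10)]
    best = max(grid, key=lambda g: utility(sampling_times(*g)))
    print("best (t1, ratio):", best, "->", sampling_times(*best).round(2))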

Relevance:

30.00%

Publisher:

Abstract:

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that are able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people; observations with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
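
The spatial coupling that makes the model "Semi-2D" is beyond a short example, but the underlying novelty-detection formulation can be sketched with a standard temporal HMM: train on normal features, then flag sequences whose likelihood falls below a threshold. The hmmlearn library and the synthetic features here are illustrative assumptions, not the paper's setup.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # assumed dependency for illustration

    rng = np.random.default_rng(1)

    # Stand-in for per-frame location/flow features (T frames x D dimensions).
    normal_train = rng.normal(0.0, 1.0, size=(500, 4))

    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    model.fit(normal_train)

    # Set the threshold from per-frame log-likelihood on held-out normal data.
    held_out = rng.normal(0.0, 1.0, size=(200, 4))
    threshold = model.score(held_out) / len(held_out) - 3.0

    def is_abnormal(seq):
        """Flag sequences whose average log-likelihood falls below the threshold."""
        return model.score(seq) / len(seq) < threshold

    print(is_abnormal(rng.normal(0.0, 1.0, size=(50, 4))))  # expected: False
    print(is_abnormal(rng.normal(5.0, 1.0, size=(50, 4))))  # expected: True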

Relevance:

30.00%

Publisher:

Abstract:

We consider the space fractional advection–dispersion equation, which is obtained from the classical advection–diffusion equation by replacing the spatial derivatives with a generalised derivative of fractional order. We derive a finite volume method that utilises fractionally-shifted Grünwald formulae for the discretisation of the fractional derivative, to numerically solve the equation on a finite domain with homogeneous Dirichlet boundary conditions. We prove that the method is stable and convergent when coupled with an implicit timestepping strategy. Results of numerical experiments are presented that support the theoretical analysis.
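
For reference, a hedged reconstruction of the setting (our notation, not necessarily the paper's): the classical second spatial derivative is replaced by a fractional derivative of order alpha, and the shifted Grünwald formula discretises it as

    \begin{align}
      \frac{\partial u}{\partial t}
        &= -v\,\frac{\partial u}{\partial x}
           + D\,\frac{\partial^{\alpha} u}{\partial x^{\alpha}},
        \qquad 1 < \alpha \le 2, \\
      \frac{\partial^{\alpha} u}{\partial x^{\alpha}}\bigg|_{x_i}
        &\approx \frac{1}{h^{\alpha}} \sum_{k=0}^{i+1} g_k\, u\big(x_i - (k-1)h\big),
        \qquad g_k = (-1)^k \binom{\alpha}{k}.
    \end{align}

The one-point shift (k-1 rather than k) is the standard device that keeps implicit schemes of this kind stable.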

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to implement a game-theory-based offline mission path planner for aerial inspection tasks of large linear infrastructures. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. The goal of this work is therefore to develop a Multi-Objective (MO) optimisation tool able to provide a set of optimal solutions for the inspection task, given the environment data, the mission requirements and the definition of the objectives to minimise. Results indicate the robustness of the method and its capability to find trade-offs among the Pareto-optimal solutions.
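
As a concrete illustration of the multi-objective output, here is a minimal sketch of Pareto-front extraction; the objectives (path length, collision risk) are stand-ins, not the paper's actual objective definitions.

    def dominates(a, b):
        """a dominates b if a is no worse in every objective and better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(solutions):
        """Keep only the non-dominated (Pareto-optimal) candidate paths."""
        return [s for s in solutions
                if not any(dominates(o, s) for o in solutions if o is not s)]

    # Each candidate inspection path scored as (path_length, collision_risk).
    candidates = [(10.0, 0.30), (12.0, 0.10), (11.0, 0.40), (15.0, 0.05)]
    print(pareto_front(candidates))  # -> [(10.0, 0.3), (12.0, 0.1), (15.0, 0.05)]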

Relevance:

30.00%

Publisher:

Abstract:

The convergence of corporate social responsibility (CSR) and corporate governance has an immense impact on the participants in global supply chains. Global buyers and retailers tend to incorporate CSR into all stages of product manufacturing within their supply chains. This incorporation of CSR creates difficulties for small- and medium-sized manufacturing enterprises (SMEs). Incompetence in standardized CSR practices is an important issue that causes SMEs either to lose direct access to the global market or to serve as subcontractors to large enterprises. This article explores this issue by focusing on Bangladeshi SMEs subject to the CSR requirements of a major global buyer.

Relevance:

30.00%

Publisher:

Abstract:

The selection of optimal camera configurations (camera locations, orientations, etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions to achieve different tasks. Most of them, however, do not generalize well to large-scale networks. To tackle this, we introduce a statistical formulation of the optimal selection of camera configurations and propose a Trans-Dimensional Simulated Annealing (TDSA) algorithm to solve the problem effectively. We compare our approach with a state-of-the-art method based on Binary Integer Programming (BIP) and show that our approach offers similar performance on small-scale problems. However, we also demonstrate the capability of our approach to deal with large-scale problems, and show that it produces better results than two alternative heuristics designed to address the scalability issue of BIP.
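
A minimal sketch of what makes the annealing "trans-dimensional": the proposal moves can add or remove a camera, not just perturb one, so the optimiser searches over configurations of varying size. The coverage objective and all parameters below are illustrative assumptions, not the paper's statistical formulation.

    import math, random

    random.seed(0)
    TARGETS = [(random.random(), random.random()) for _ in range(50)]

    def coverage(cams, radius=0.25):
        """Fraction of targets within range of some camera, minus a cost per camera."""
        seen = sum(any(math.dist(t, c) < radius for c in cams) for t in TARGETS)
        return seen / len(TARGETS) - 0.02 * len(cams)

    def propose(cams):
        """Birth, death, or perturb move -- the trans-dimensional part."""
        cams, move = list(cams), random.random()
        if move < 0.3 or not cams:                      # birth: add a camera
            cams.append((random.random(), random.random()))
        elif move < 0.6 and len(cams) > 1:              # death: remove a camera
            cams.pop(random.randrange(len(cams)))
        else:                                           # perturb one camera's location
            i = random.randrange(len(cams))
            cams[i] = (cams[i][0] + random.gauss(0, 0.05),
                       cams[i][1] + random.gauss(0, 0.05))
        return cams

    state, temp = [(0.5, 0.5)], 1.0
    for step in range(5000):
        cand = propose(state)
        # Metropolis acceptance for a maximisation problem, with cooling.
        if math.exp(min(0.0, (coverage(cand) - coverage(state)) / temp)) > random.random():
            state = cand
        temp *= 0.999
    print(len(state), round(coverage(state), 3))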

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the escalation of intra-familial conflicts in family top management teams. Using a Critical Incident Technique approach, interviews were conducted with 23 family and non-family individuals and groups within six large-scale privately held family businesses in Indonesia. The study develops a theoretical model to explain why family business conflicts escalate and become destructive. An inductive content analysis found that the use of a dominating strategy by both parties in dealing with conflict, the expression of negative emotions, and the involvement of non-family employees are more likely to cause escalation. This study contributes to the theory of family business conflict and aims to help make family businesses more satisfying and productive.

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for providers managing SaaS, especially in large-scale data centres like the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses that gap by focusing on three new problems of composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are characterised as highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main technique for solving them.

The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response-time constraints. Existing research on this problem often ignores the dependencies between components and considers the placement of only a homogeneous type of component. A precise formulation of the composite SaaS placement problem is presented. A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms.

In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs); however, due to the dynamic environment of a Cloud, the current placement may need to be modified. Existing techniques focus mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise the resources used and maintain SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the structural grouping of a composite SaaS: the first GGA uses a repair-based method, while the second uses a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs always produce a better reconfiguration placement plan than a common heuristic for clustering problems.

The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task; the problem also involves constraints and interdependencies between components, making solutions even more difficult to find. A hybrid genetic algorithm (HGA) was developed to solve this problem by exploring the search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the constraints and achieve the objectives. The experimental results demonstrate that the HGA consistently outperforms a heuristic algorithm, achieving a low-cost scaling and placement plan.

This research has identified three significant new problems for composite SaaS in the Cloud, and the various evolutionary algorithms developed to address them contribute to the evolutionary computation field. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
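
As a flavour of the evolutionary approach taken throughout, here is a minimal genetic-algorithm sketch for the placement problem: a chromosome assigns each component to a server, and the fitness penalises cross-server traffic between dependent components and capacity violations. All sizes, capacities and dependencies are illustrative assumptions, and the thesis's cooperative co-evolutionary and grouping variants are not reproduced here.

    import random

    random.seed(0)
    N_COMP, N_SERV = 8, 3
    SIZE = [2, 1, 3, 2, 1, 2, 3, 1]          # resource demand per component
    CAP = [6, 6, 6]                          # capacity per server
    DEPS = [(0, 1), (1, 2), (3, 4), (5, 6)]  # dependent component pairs

    def fitness(chrom):
        comm = sum(chrom[a] != chrom[b] for a, b in DEPS)         # cross-server traffic
        load = [sum(s for s, g in zip(SIZE, chrom) if g == srv)
                for srv in range(N_SERV)]
        overload = sum(max(0, l - c) for l, c in zip(load, CAP))  # capacity violations
        return -(comm + 10 * overload)                            # higher is better

    def evolve(pop_size=40, gens=200):
        pop = [[random.randrange(N_SERV) for _ in range(N_COMP)]
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            pop = pop[:pop_size // 2]                    # truncation selection
            while len(pop) < pop_size:
                a, b = random.sample(pop[:10], 2)
                cut = random.randrange(1, N_COMP)        # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < 0.2:                # mutation
                    child[random.randrange(N_COMP)] = random.randrange(N_SERV)
                pop.append(child)
        return max(pop, key=fitness)

    best = evolve()
    print(best, fitness(best))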

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Data from two randomized phase III trials were analyzed to evaluate prognostic factors and treatment selection in the first-line management of advanced non-small cell lung cancer patients with performance status (PS) 2. Patients and Methods: Patients randomized to combination chemotherapy (carboplatin and paclitaxel) in one trial and single-agent therapy (gemcitabine or vinorelbine) in the second were included in these analyses. Both studies had identical eligibility criteria and were conducted simultaneously. Comparison of efficacy and safety was performed between the two cohorts. A regression analysis identified prognostic factors and subgroups of patients that may benefit from combination or single-agent therapy. Results: Two hundred one patients were treated with combination and 190 with single-agent therapy. Objective response rates were 37 and 15%, respectively. Median time to progression was 4.6 months in the combination arm and 3.5 months in the single-agent arm (p < 0.001). Median survival times were 8.0 and 6.6 months, and 1-year survival rates were 31 and 26%, respectively. Albumin <3.5 g, extrathoracic metastases, lactate dehydrogenase ≥200 IU, and 2 comorbid conditions predicted outcome. Patients with 0-2 risk factors had similar outcomes independent of treatment, whereas patients with 3-4 factors had a nonsignificant improvement in median survival with combination chemotherapy. Conclusion: Our results show that PS 2 non-small cell lung cancer patients are a heterogeneous group with significantly different outcomes. Patients treated with first-line combination chemotherapy had a higher response rate and longer time to progression, whereas overall survival did not appear significantly different. A prognostic model may be helpful in selecting PS 2 patients for either treatment strategy. © 2009 by the International Association for the Study of Lung Cancer.

Relevance:

30.00%

Publisher:

Abstract:

Due to the demand for better and deeper analysis in sports, organizations (both professional teams and broadcasters) are looking to use spatiotemporal data in the form of player tracking information to obtain an advantage over their competitors. However, due to the large volume of data, its unstructured nature, and the lack of associated team activity labels (e.g. strategic/tactical), effective and efficient strategies to deal with such data have yet to be deployed. A bottleneck restricting such solutions is the lack of a suitable representation (i.e. ordering of players) which is immune to the combinatorially large number of possible player orderings, in addition to the high dimensionality of the temporal signal (e.g. a game of soccer lasts 90 minutes). We leverage a recent method which utilizes a "role representation", as well as a feature reduction strategy that uses a spatiotemporal bilinear basis model, to form a compact spatiotemporal representation. Using this representation, we find the most likely formation patterns of a team associated with match events across nearly 14 hours of continuous player and ball tracking data in soccer. Additionally, we show that we can accurately segment a match into distinct game phases and detect highlights (i.e. shots, corners, free kicks) completely automatically using a decision-tree formulation.
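
The decision-tree step can be sketched in a few lines; the windowed features and labels below are synthetic stand-ins for the tracking-derived representation described above, not the paper's data.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier  # assumed dependency

    rng = np.random.default_rng(2)

    # Per-window features: [mean ball x, ball speed, players in final third].
    X_normal = rng.normal([0.5, 2.0, 3.0], 0.3, size=(300, 3))
    X_shot   = rng.normal([0.9, 6.0, 8.0], 0.3, size=(60, 3))
    X = np.vstack([X_normal, X_shot])
    y = np.array([0] * 300 + [1] * 60)      # 0 = open play, 1 = shot window

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(clf.predict([[0.92, 5.5, 7.0]]))  # -> [1], likely a shot window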

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes an approach to achieve resilient navigation for indoor mobile robots. Resilient navigation seeks to mitigate the impact of control, localisation, or map errors on the safety of the platform while preserving the robot's ability to achieve its goal. We show that resilience to unpredictable errors can be achieved by combining the benefits of independent and complementary algorithmic approaches to navigation, or modalities, each tuned to a particular type of environment or situation. In this paper, the modalities comprise a path planning method and a reactive motion strategy. While the robot navigates, a Hidden Markov Model continually estimates the most appropriate modality based on two types of information: context (information known a priori) and monitoring (evaluating unpredictable aspects of the current situation). The robot then uses the recommended modality, switching between one and another dynamically. Experimental validation with a SegwayRMP-based platform in an office environment shows that our approach enables failure mitigation while maintaining the safety of the platform. The robot is shown to reach its goal in the presence of: 1) unpredicted control errors, 2) unexpected map errors and 3) a large injected localisation fault.
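
The modality-selection idea can be sketched as a two-state HMM forward filter whose most probable state picks the modality; the transition and observation probabilities below are illustrative assumptions, not the paper's values.

    import numpy as np

    MODALITIES = ["path_planner", "reactive"]
    T = np.array([[0.95, 0.05],   # modality persistence (transition matrix)
                  [0.10, 0.90]])
    # P(observation | modality); observations: 0 = nominal, 1 = localisation doubt.
    E = np.array([[0.9, 0.1],
                  [0.3, 0.7]])

    def select_modality(obs_stream, belief=np.array([0.5, 0.5])):
        for o in obs_stream:              # standard HMM forward filter
            belief = E[:, o] * (T.T @ belief)
            belief /= belief.sum()
        return MODALITIES[int(np.argmax(belief))], belief

    print(select_modality([0, 0, 1, 1, 1]))  # belief drifts toward "reactive"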

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The increasing number of assembled mammalian genomes makes it possible to compare genome organisation across mammalian lineages and reconstruct the chromosomes of the ancestral marsupial and therian (marsupial and eutherian) mammals. However, the reconstruction of ancestral genomes requires genome assemblies to be anchored to chromosomes. The recently sequenced tammar wallaby (Macropus eugenii) genome was assembled into over 300,000 contigs. We previously devised an efficient strategy for mapping large evolutionarily conserved blocks in non-model mammals, and applied it here to determine the arrangement of conserved blocks on all wallaby chromosomes, permitting comparative maps to be constructed and resolving the long-debated issue of whether the ancestral marsupial karyotype was 2n=14 or 2n=22. RESULTS: We identified large blocks of genes conserved between human and opossum, and mapped genes corresponding to the ends of these blocks by fluorescence in situ hybridization (FISH). A total of 242 genes were assigned to wallaby chromosomes in the present study, bringing the total number of genes mapped to 554 and making this the most densely cytogenetically mapped marsupial genome. We used these gene assignments to construct comparative maps between wallaby and opossum, which uncovered many intrachromosomal rearrangements, particularly for genes found on wallaby chromosomes X and 3. Expanding the comparisons to include chicken and human permitted the putative ancestral marsupial (2n=14) and therian mammal (2n=19) karyotypes to be reconstructed. CONCLUSIONS: Our physical mapping data for the tammar wallaby have uncovered the events shaping marsupial genomes and enabled us to predict the ancestral marsupial karyotype, supporting a 2n=14 ancestor. Furthermore, our predicted therian ancestral karyotype has helped us to understand the evolution of the ancestral eutherian genome.

Relevance:

30.00%

Publisher:

Abstract:

Phishing, a form of online identity theft, is a major problem worldwide, accounting for more than $7.5 billion in losses in the US alone between 2005 and 2008. Australia was the first country to be targeted by Internet bank phishing in 2003 and continues to have a significant problem in this area. The major cybercrime groups responsible for phishing are based in Eastern Europe. They operate with a large degree of freedom due to the inherent difficulties in cross-border law enforcement and the current situation in Eastern Europe, particularly in Russia and Ukraine. They employ highly sophisticated and efficient technical tools to compromise victims and subvert bank authentication systems. However, because it is difficult for them to repatriate the fraudulently obtained funds directly, they employ Internet money mules in Australia to transfer the money via Western Union or MoneyGram. We propose a strategy in which Australian law enforcement first places more focus on transactions via Western Union and MoneyGram to detect this money laundering, which would significantly impact the success of the phishing attack model. Combined with technical monitoring of Trojan technology and education of potential Internet money mules to avoid being duped, this would provide a winning strategy in the war on phishing for Australia.

Relevance:

30.00%

Publisher:

Abstract:

Business processes are an important instrument for understanding and improving how companies provide goods and services to customers. Many companies have therefore documented their business processes well, often as Event-driven Process Chains (EPCs). Unfortunately, in many cases the resulting EPCs are rather complex, so that the overall process logic is hidden in low-level process details. This paper proposes abstraction mechanisms for process models that aim to reduce their complexity while keeping the overall process structure. We assume that functions are marked with efforts and splits are marked with probabilities; this information is used to separate important process parts from less important ones. Real-world process models are used to validate the approach.
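
Under the stated assumption that functions carry efforts and splits carry probabilities, the aggregation an abstraction must perform can be sketched as follows; the two rules shown (sum over a sequence, probability-weighted sum over an XOR split) are a simplified illustration, not the paper's full mechanism.

    def sequence_effort(efforts):
        """A sequence of functions is abstracted to the sum of their efforts."""
        return sum(efforts)

    def xor_split_effort(branches):
        """An XOR split is abstracted to the probability-weighted branch effort."""
        return sum(p * e for p, e in branches)

    # Fragment: a function (effort 2), then an XOR split over two branches.
    fragment = sequence_effort([2.0, xor_split_effort([(0.7, 5.0), (0.3, 1.0)])])
    print(fragment)  # 2.0 + 0.7*5.0 + 0.3*1.0 = 5.8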