99 results for "average complexity"

in Queensland University of Technology - ePrints Archive


Relevance:

30.00%

Publisher:

Abstract:

In the multi-view approach to semisupervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm, in which the views are reproducing kernel Hilbert spaces (RKHSs), and the disagreement penalty is the average squared difference in predictions. The final predictor is the pointwise average of the predictors from each view. We call the set of predictors that can result from this procedure the co-regularized hypothesis class. Our main result is a tight bound on the Rademacher complexity of the co-regularized hypothesis class in terms of the kernel matrices of each RKHS. We find that co-regularization reduces the Rademacher complexity by an amount that depends on the distance between the two views, as measured by a data-dependent metric. We then use standard techniques to bound the gap between training error and test error for the CoRLS algorithm. Experimentally, we find that the amount of reduction in complexity introduced by co-regularization correlates with the amount of improvement that co-regularization gives in the CoRLS algorithm.
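For orientation, the objective the abstract describes can be written out. The following is a sketch in assumed notation (the labeled pairs (x_i, y_i), unlabeled points u_j, regularization weights γ_1, γ_2 and coupling weight λ are illustrative, not taken from the paper):

```latex
% One common formulation of co-regularized least squares (CoRLS):
% per-view RKHS penalties plus an average squared-disagreement penalty
% over the m unlabeled points.
\min_{f_1 \in \mathcal{H}_1,\; f_2 \in \mathcal{H}_2}
  \sum_{i=1}^{n} \Big( y_i - \tfrac{1}{2}\big(f_1(x_i) + f_2(x_i)\big) \Big)^{2}
  + \gamma_1 \|f_1\|_{\mathcal{H}_1}^{2}
  + \gamma_2 \|f_2\|_{\mathcal{H}_2}^{2}
  + \frac{\lambda}{m} \sum_{j=1}^{m} \big( f_1(u_j) - f_2(u_j) \big)^{2}
```

The final predictor is the pointwise average (f_1 + f_2)/2, and the last term is the average squared disagreement the abstract refers to; λ controls how strongly the two views are coupled, and the paper's bound quantifies how this coupling shrinks the Rademacher complexity of the resulting hypothesis class.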

Relevance:

20.00%

Publisher:

Abstract:

We generalize the classical notion of Vapnik–Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, and a set D of first-order formulas that represent data. The sets of models of ϕ in W, where ϕ varies over D, generate a natural topology on W. We show that if D is closed under Boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization of the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference, when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity) and mind change complexity. The assumptions that D is closed under Boolean operators and that W is compact often play a crucial role in establishing connections between these concepts. We then consider a computable setting with effective versions of the complexity measures, and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
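For reference, the classical notion being generalized is the textbook VC-dimension, stated here for a hypothesis class over a set X with each hypothesis identified with the set of points it labels positively (this restates a standard definition, not anything specific to the paper):

```latex
% S \subseteq X is shattered by \mathcal{H} when every subset of S is
% cut out by some hypothesis; the VC-dimension is the largest such size.
\mathrm{VC}(\mathcal{H}) \;=\; \sup \big\{\, |S| \;:\;
    \{\, h \cap S : h \in \mathcal{H} \,\} = 2^{S} \,\big\}
```

The paper's ordinal VC-dimension replaces this single cardinality with an ordinal-valued measure, which is what allows it to bound the number of prediction mistakes by an ordinal.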

Relevance:

20.00%

Publisher:

Abstract:

Carrier frequency offset (CFO) and I/Q mismatch can cause significant performance degradation in OFDM systems. Their estimation and compensation are generally difficult because they are entangled in the received signal. In this paper, we propose low-complexity estimation and compensation schemes for the receiver that are robust to a wide range of CFO and I/Q mismatch values, although performance is slightly degraded for very small CFO. These schemes consist of three steps: forming a cosine estimator free of I/Q mismatch interference, estimating the I/Q mismatch using the estimated cosine value, and forming a sine estimator from samples taken after I/Q mismatch compensation. These estimators are based on the observation that an estimate of the cosine serves much better as the basis for I/Q mismatch estimation than an estimate of the CFO derived from the cosine function. Simulation results show that the proposed schemes improve system performance significantly and are robust to CFO and I/Q mismatch.
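For context, a commonly used baseband model in which the two impairments become entangled is the following (the notation is illustrative; the paper's exact signal model and estimators are not reproduced here):

```latex
% Received sample with normalized CFO \epsilon and receiver I/Q mismatch
% parameters (\alpha, \beta); an ideal receiver has \alpha = 1, \beta = 0.
r(n) \;=\; \alpha\, s(n)\, e^{\, j 2\pi \epsilon n / N}
      \;+\; \beta\, s^{*}(n)\, e^{-j 2\pi \epsilon n / N} \;+\; w(n)
```

The image term weighted by β is what couples the impairments: a CFO estimator designed for β = 0 is biased by the mismatch, which is why a quantity free of I/Q mismatch interference (here, a cosine of the CFO-induced phase) is estimated first.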

Relevance:

20.00%

Publisher:

Abstract:

New product development projects are experiencing increasing internal and external project complexity. Complexity leadership theory proposes that external complexity requires adaptive and enabling leadership, which facilitates opportunity recognition (OR). We ask whether internal complexity also requires OR for increased adaptability. We extend a model of EO and OR and conclude that internal complexity may require more careful OR. This means that leaders of technically or structurally complex projects need to evaluate opportunities more carefully than leaders of projects with external or technological complexity.

Relevance:

20.00%

Publisher:

Abstract:

The importance of agriculture in many countries has tended to decline as their economies move from a resource base to a manufacturing industry base. Although the level of agricultural production in first-world countries has increased over the past two decades, this increase has generally been at a less significant rate than in other sectors of the economy. Despite this growth in secondary and high-technology industries, developed countries have continued to encourage and support their agricultural industries, through both tariffs and price support. Following pressure from developing economies, particularly through the World Trade Organisation (WTO), the GATT Uruguay round and the Cairns Group, developed countries are now at various stages of winding back or de-coupling agricultural support within their economies. A major concern of farmers in protected agricultural markets is the impact of free-market trade in agricultural commodities on farm incomes, profitability and land values. This paper will analyse both the capital and income performance of the NSW rural land market over the period 1990-1999. The analysis will be based on several rural land use classifications and will compare the total return from rural properties based on the farm income generated by both the average farmer and those farmers considered to be in the top 20% of the various land use areas. The analysis will provide a comprehensive overview of rural production in a free trade economy.

Relevance:

20.00%

Publisher:

Abstract:

There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews with project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issues of ‘measuring’ or assessing complexity. A research agenda is proposed to further the investigation of the phenomena reported in this initial study.

Relevance:

20.00%

Publisher:

Abstract:

While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge is dealing with process model complexity in order to improve the understanding of a process model by stakeholders and process analysts. Features for dealing with this complexity can be classified into two categories: 1) those that are solely concerned with the appearance of the model, and 2) those that in essence change the structure of the model. In this paper we focus on the former category and present a collection of patterns that generalize and conceptualize various existing features. The paper concludes with a detailed analysis of the degree of support for these patterns in a number of state-of-the-art languages and language implementations.

Relevance:

20.00%

Publisher:

Abstract:

The effects of particulate matter on the environment and public health have been widely studied in recent years. A number of studies in the medical field have tried to identify the specific effects of particulate exposure on human health, but agreement amongst these studies on the relative importance of particle size and origin with respect to health effects is still lacking. Nevertheless, air quality standards, like epidemiological attention, are moving towards a greater focus on smaller particles. Current air quality standards only regulate the mass of particulate matter less than 10 μm in aerodynamic diameter (PM10) and less than 2.5 μm (PM2.5). The most reliable method for measuring Total Suspended Particles (TSP), PM10, PM2.5 and PM1 is the gravimetric method, since it directly measures PM concentration, guaranteeing effective traceability to international standards. This technique, however, cannot capture short-term intra-day variations in the atmospheric parameters that influence ambient particle concentration and size distribution (emission strengths of particle sources, temperature, relative humidity, wind direction and speed, and mixing height), or in human activity patterns, which may also vary over time periods considerably shorter than 24 hours. A continuous method for measuring the number size distribution and total number concentration in the range 0.014–20 μm is the tandem system constituted by a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). In this paper, an uncertainty budget model for the measurement of airborne particle number, surface area and mass size distributions is proposed and applied to several typical aerosol size distributions. The estimation of such an uncertainty budget presents several difficulties due to i) the complexity of the measurement chain, and ii) the fact that the SMPS and APS can properly guarantee traceability to the International System of Measurements only in terms of number concentration; the surface area and mass concentration must be estimated on the basis of separately determined average density and particle morphology.

Keywords: SMPS-APS tandem system, gravimetric reference method, uncertainty budget, ultrafine particles.
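As a minimal sketch of that last point (deriving surface area and mass concentrations from a measured number distribution, under an assumed spherical morphology and a separately determined average density), the conversion looks like the following; the bin midpoints and concentrations below are made up for illustration:

```python
import numpy as np

# Illustrative bin midpoints (um) spanning an SMPS-APS range, with
# hypothetical number concentrations per bin (particles per cm^3).
d_um = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
n_cm3 = np.array([2e4, 1e4, 5e2, 1e2, 5.0, 1.0])

rho = 1.5e3        # assumed average particle density, kg/m^3 (determined separately)
d_m = d_um * 1e-6  # diameters in metres

# Spherical-particle assumption: surface pi*d^2, volume pi*d^3/6.
surface = n_cm3 * np.pi * d_m**2            # m^2 per cm^3 of air
mass = n_cm3 * rho * np.pi * d_m**3 / 6.0   # kg per cm^3 of air

# Convert to commonly reported units.
surface_um2_cm3 = surface * 1e12            # um^2 per cm^3
mass_ug_m3 = mass * 1e15                    # ug per m^3 (1e9 ug/kg * 1e6 cm^3/m^3)

print("surface (um^2/cm^3):", surface_um2_cm3.round(2))
print("total mass (ug/m^3):", mass_ug_m3.sum().round(2))
```

Any uncertainty in the assumed density or sphericity propagates directly into the surface area and mass estimates, which is one reason the uncertainty budget for those quantities is harder to establish than for number concentration.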

Relevance:

20.00%

Publisher:

Abstract:

Unmanned Aerial Vehicles (UAVs) are emerging as an ideal platform for a wide range of civil applications such as disaster monitoring, atmospheric observation and outback delivery. However, the operation of UAVs is currently restricted to specially segregated regions of airspace outside of the National Airspace System (NAS). Mission Flight Planning (MFP) is an integral part of UAV operation that addresses some of the requirements (such as safety and the rules of the air) of integrating UAVs in the NAS. Automated MFP is a key enabler for a number of UAV operating scenarios as it aids in increasing the level of onboard autonomy. For example, onboard MFP is required to ensure continued conformance with the NAS integration requirements when there is an outage in the communications link.

MFP is a motion planning task concerned with finding a path between a designated start waypoint and goal waypoint. This path is described by a sequence of 4 Dimensional (4D) waypoints (three spatial and one time dimension) or, equivalently, by a sequence of trajectory segments (or tracks). It is necessary to consider the time dimension as the UAV operates in a dynamic environment. Existing methods for generic motion planning, UAV motion planning and general vehicle motion planning cannot adequately address the requirements of MFP. The flight plan needs to optimise for multiple decision objectives, including mission safety objectives, the rules of the air and mission efficiency objectives. Online (in-flight) replanning capability is needed as the UAV operates in a large, dynamic and uncertain outdoor environment.

This thesis derives a multi-objective 4D search algorithm entitled Multi-Step A* (MSA*), based on the seminal A* search algorithm. MSA* is proven to find the optimal (least cost) path given a variable successor operator (which enables arbitrary track angle and track velocity resolution). Furthermore, it is shown to be of comparable complexity to multi-objective, vector neighbourhood based A* (Vector A*, an extension of A*). A variable successor operator enables the imposition of a multi-resolution lattice structure on the search space (which results in fewer search nodes). Unlike cell decomposition based methods, soundness is guaranteed with multi-resolution MSA*. MSA* is demonstrated through Monte Carlo simulations to be computationally efficient. It is shown that multi-resolution, lattice based MSA* finds paths of equivalent cost (less than 0.5% difference) to Vector A* (the benchmark) in a third of the computation time (on average). This is the first contribution of the research.

The second contribution is the discovery of the additive consistency property for planning with multiple decision objectives. Additive consistency ensures that the planner is not biased (which would result in a suboptimal path) by ensuring that the cost of traversing a track in one step equals that of traversing the same track in multiple steps. MSA* mitigates uncertainty through online replanning, Multi-Criteria Decision Making (MCDM) and tolerance. Each trajectory segment is modelled with a cell sequence that completely encloses the trajectory segment. The tolerance, measured as the minimum distance between the track and cell boundaries, is the third major contribution. Even though MSA* is demonstrated for UAV MFP, it is extensible to other 4D vehicle motion planning applications. Finally, the research proposes a self-scheduling replanning architecture for MFP.
This architecture replicates the decision strategies of human experts to meet the time constraints of online replanning. Based on a feedback loop, the proposed architecture switches between fast, near-optimal planning and optimal planning to minimise the need for hold manoeuvres. The derived MFP framework is original and shown, through extensive verification and validation, to satisfy the requirements of UAV MFP. As MFP is an enabling factor for operation of UAVs in the NAS, the presented work is both original and significant.
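To make the search concepts concrete, here is a generic weighted-sum, multi-objective A* skeleton over 4D waypoints. It is only an assumption-laden sketch of the family of algorithms MSA* belongs to (the successor operator, cost tuple and weights are placeholders), not the MSA* algorithm itself:

```python
import heapq
import itertools

def plan_4d(start, goal, successors, costs, weights, heuristic):
    """Generic A* over 4D waypoints (x, y, z, t).

    successors(node): candidate next waypoints -- the 'successor operator'
        (MSA* generalizes this to variable-resolution multi-steps).
    costs(a, b): tuple of per-objective costs (e.g. safety, rules of the
        air, efficiency), combined here by a simple weighted sum.
    heuristic(a, b): lower bound on the combined cost from a to b;
        assumed consistent so skipping re-expansions below is safe.
    """
    tie = itertools.count()  # tie-breaker so the heap never compares nodes
    frontier = [(heuristic(start, goal), next(tie), 0.0, start, None)]
    parent, best_g = {}, {start: 0.0}
    while frontier:
        _, _, g, node, prev = heapq.heappop(frontier)
        if node in parent:          # already expanded with a better cost
            continue
        parent[node] = prev
        if node == goal:            # reconstruct the waypoint sequence
            path = [node]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for nxt in successors(node):
            g2 = g + sum(w * c for w, c in zip(weights, costs(node, nxt)))
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(
                    frontier,
                    (g2 + heuristic(nxt, goal), next(tie), g2, nxt, node),
                )
    return None  # no feasible plan found
```

In this skeleton, the additive consistency property described above amounts to requiring that the summed weighted costs of several short steps along a track equal the cost of one long step along the same track; otherwise the planner would systematically favour one step size over another.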

Relevance:

20.00%

Publisher:

Abstract:

Successful product innovation and the ability of companies to continuously improve their innovation processes are rapidly becoming essential requirements for competitive advantage and long-term growth in both manufacturing and service industries. It is now recognized that companies must develop innovation capabilities across all stages of the product development, manufacture, and distribution cycle. These Continuous Product Innovation (CPI) capabilities are closely associated with a company’s knowledge management systems and processes, and companies must develop mechanisms to continuously improve these capabilities over time. Using the results of an international survey on CPI practices, sets of companies are identified by similarities in specific contingencies related to the complexity of their products, processes, technologies, and customer interfaces. Differences between the learning behaviors present in the company groups, and in the levers used to develop and support these behaviors, are identified and discussed. The paper also discusses appropriate mechanisms for firms with similar complexities, and some approaches they can use to improve their organizational learning and product innovation.