830 results for Gradient-based approaches


Relevance:

80.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-05

Relevance:

80.00%

Publisher:

Abstract:

By examining the work of several NGOs in the context of post-conflict reconstruction in Bosnia and Herzegovina (BiH), this essay scrutinizes both the potential and limits of NGO contributions to peace settlements and long-term stability. While their ability to specialize and reach the grassroots level is of great practical significance, the contribution of NGOs to the reconstruction of war-torn societies is often idealized. NGOs remain severely limited by ad hoc and project-specific funding sources, as well as by the overall policy environment in which they operate. Unless these underlying issues are addressed, NGOs will ultimately become little more than extensions of prevalent multilateral and state-based approaches to post-conflict reconstruction.

Relevance:

80.00%

Publisher:

Abstract:

Minimum/maximum autocorrelation factor (MAF) is a suitable algorithm for orthogonalization of a vector random field. Orthogonalization avoids the use of multivariate geostatistics during joint stochastic modeling of geological attributes. This manuscript demonstrates in a practical way that computation of MAF is the same as discriminant analysis of the nested structures. Mathematica software is used to illustrate MAF calculations from a linear model of coregionalization (LMC). The limitation of two nested structures in the LMC for MAF is also discussed and linked to the effects of anisotropy and support. The analysis elucidates the matrix properties behind the approach and clarifies relationships that may be useful for model-based approaches. (C) 2003 Elsevier Science Ltd. All rights reserved.
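
A small numerical sketch of the discriminant-analysis view described above: for a two-structure LMC, the MAF loadings can be obtained from a generalized eigenvalue problem involving the two coregionalization matrices. The matrices B1 and B2 below are illustrative values, not taken from the manuscript, and Python/SciPy is used in place of the Mathematica calculations.

```python
# A minimal sketch (not the paper's Mathematica code): MAF loadings for a
# two-structure LMC obtained from a generalized eigenvalue problem, which is
# the discriminant-analysis view of MAF described in the abstract.
import numpy as np
from scipy.linalg import eigh

# Illustrative (assumed) coregionalization matrices of the two nested structures
B1 = np.array([[1.0, 0.4],
               [0.4, 0.8]])          # short-range structure
B2 = np.array([[0.6, -0.2],
               [-0.2, 0.9]])         # long-range structure
C0 = B1 + B2                         # covariance at lag 0 (unit sills assumed)

# Generalized eigenproblem B1 a = lambda C0 a; the eigenvectors are the MAF
# loadings, ordered by the autocorrelation-related eigenvalues.
eigvals, A = eigh(B1, C0)

# Applying the transform to a data matrix Z (n samples x 2 variables)
Z = np.random.default_rng(0).normal(size=(100, 2))
Y = Z @ A                            # MAF factors

# The same loadings diagonalize both nested structures simultaneously
print(np.round(A.T @ B1 @ A, 6))
print(np.round(A.T @ B2 @ A, 6))
```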

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates the non-linear bending behaviour of functionally graded plates that are bonded with piezoelectric actuator layers and subjected to transverse loads and a temperature gradient, based on Reddy's higher-order shear deformation plate theory. The von Karman-type geometric non-linearity, piezoelectric and thermal effects are included in the mathematical formulation. The temperature change is due to steady-state heat conduction through the plate thickness. The material properties are assumed to be graded in the thickness direction according to a power-law distribution in terms of the volume fractions of the constituents. The plate is clamped at two opposite edges, while the remaining edges can be free, simply supported or clamped. Differential quadrature approximation in the X-axis is employed to convert the partial differential governing equations and the associated boundary conditions into a set of ordinary differential equations. By choosing the appropriate functions as the displacement and stress functions on each nodal line and then applying the Galerkin procedure, a system of non-linear algebraic equations is obtained, from which the non-linear bending response of the plate is determined through a Picard iteration scheme. Numerical results for zirconia/aluminium rectangular plates are given in dimensionless graphical form. The effects of the applied actuator voltage, the volume fraction exponent, the temperature gradient, as well as the characteristics of the boundary conditions are also studied in detail. Copyright (C) 2004 John Wiley & Sons, Ltd.
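
As a side note, the power-law grading rule mentioned in the abstract can be written down in a few lines. The sketch below assumes the common convention Vc(z) = (z/h + 1/2)^n for the ceramic volume fraction and uses illustrative zirconia/aluminium moduli; it is not the paper's formulation of the full plate problem.

```python
# A minimal sketch of the power-law grading rule, assuming the common
# convention Vc(z) = (z/h + 1/2)^n for the ceramic volume fraction through the
# thickness -h/2 <= z <= h/2. The zirconia/aluminium moduli are illustrative.
import numpy as np

def effective_property(z, h, n, P_ceramic, P_metal):
    """Power-law rule of mixtures for a through-thickness graded property."""
    Vc = (z / h + 0.5) ** n              # ceramic volume fraction
    return (P_ceramic - P_metal) * Vc + P_metal

h = 0.01                                  # plate thickness [m] (assumed)
n = 2.0                                   # volume fraction exponent (assumed)
E_zirconia, E_aluminium = 151e9, 70e9     # Young's moduli [Pa], illustrative

z = np.linspace(-h / 2, h / 2, 5)
print(effective_property(z, h, n, E_zirconia, E_aluminium))
```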

Relevance:

80.00%

Publisher:

Abstract:

A major problem in de novo design of enzyme inhibitors is the unpredictability of the induced fit, with the shape of both ligand and enzyme changing cooperatively and unpredictably in response to subtle structural changes within a ligand. We have investigated the possibility of dampening the induced fit by using a constrained template as a replacement for adjoining segments of a ligand. The template preorganizes the ligand structure, thereby organizing the local enzyme environment. To test this approach, we used templates consisting of constrained cyclic tripeptides, formed through side chain to main chain linkages, as structural mimics of the protease-bound extended beta-strand conformation of three adjoining amino acid residues at the N- or C-terminal sides of the scissile bond of substrates. The macrocyclic templates were derivatized to a range of 30 structurally diverse molecules via focused combinatorial variation of nonpeptidic appendages incorporating a hydroxyethylamine transition-state isostere. Most compounds in the library were potent inhibitors of the test protease (HIV-1 protease). Comparison of crystal structures for five protease-inhibitor complexes containing an N-terminal macrocycle and three protease-inhibitor complexes containing a C-terminal macrocycle establishes that the macrocycles fix their surrounding enzyme environment, thereby permitting independent variation of acyclic inhibitor components with only local disturbances to the protease. In this way, the location in the protease of various acyclic fragments on either side of the macrocyclic template can be accurately predicted. This type of templating strategy minimizes the problem of induced fit, reducing unpredictable cooperative effects in one inhibitor region caused by changes to adjacent enzyme-inhibitor interactions. This idea might be exploited in template-based approaches to inhibitors of other proteases, where a beta-strand mimetic is also required for recognition, and also other protein-binding ligands where different templates may be more appropriate.

Relevance:

80.00%

Publisher:

Abstract:

Substantial amounts of nitrogen (N) fertiliser are necessary for commercial sugarcane production because of the large biomass produced by sugarcane crops. Since this fertiliser is a substantial input cost and has implications if N is lost to the environment, there is a pressing need to match the supply of N to the crops' requirements. The complexity of the N cycle and the strong influence of climate, through its moderation of N transformation processes in the soil and its impact on N uptake by crops, make simulation-based approaches to this N management problem attractive. In this paper we describe the processes to be captured in modelling soil and plant N dynamics in sugarcane systems, and review the capability for modelling these processes. We then illustrate insights gained into improved management of N through simulation-based studies of crop residue management, irrigation management and greenhouse gas emissions. We conclude by identifying processes not currently represented in the models used for simulating N cycling in sugarcane production systems, and illustrate ways in which these gaps can be partially overcome in the short term. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

The Wet Tropics World Heritage Area in Far North Queensland, Australia consists predominantly of tropical rainforest and wet sclerophyll forest in areas of variable relief. Previous maps of vegetation communities in the area were produced by a labor-intensive combination of field survey and air-photo interpretation. Thus, the aim of this work was to develop a new vegetation mapping method based on imaging radar that incorporates topographical corrections, which could be repeated frequently, and which would reduce the need for detailed field assessments and associated costs. The method employed a topographic correction and mapping procedure that was developed to enable vegetation structural classes to be mapped from satellite imaging radar. Eight JERS-1 scenes covering the Wet Tropics area for 1996 were acquired from NASDA under the auspices of the Global Rainforest Mapping Project. JERS scenes were geometrically corrected for topographic distortion using an 80 m DEM and a combination of polynomial warping and radar viewing geometry modeling. An image mosaic was created to cover the Wet Tropics region, and a new technique for image smoothing was applied to the JERS texture bands and DEM before a Maximum Likelihood classification was applied to identify major land-cover and vegetation communities. Despite these efforts, dominant vegetation community classes could only be classified to a low level of accuracy (57.5 percent), which was partly explained by the significantly larger pixel size of the DEM in comparison to the JERS image (12.5 m). In addition, the spatial and floristic detail contained in the classes of the original validation maps was much finer than the JERS classification product was able to distinguish. In comparison to field and aerial photo-based approaches for mapping the vegetation of the Wet Tropics, appropriately corrected SAR data provide a more regional-scale, all-weather mapping technique for broader vegetation classes. Further work is required to establish an appropriate combination of imaging radar with elevation data and other environmental surrogates to accurately map vegetation communities across the entire Wet Tropics.
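
For readers unfamiliar with the classification step, the following sketch shows a per-pixel Gaussian Maximum Likelihood classification of a band stack, which with equal priors is equivalent to quadratic discriminant analysis. The band layout, class names and simulated data are illustrative assumptions, not the JERS-1/DEM processing chain used in the study.

```python
# A minimal sketch of a per-pixel Gaussian Maximum Likelihood classification,
# assuming a band stack (e.g. radar texture bands plus DEM) reshaped to
# (n_pixels, n_bands) and labelled training pixels. With equal priors this is
# equivalent to quadratic discriminant analysis. Class names and simulated
# data are illustrative, not the study's JERS-1 processing chain.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)

# Illustrative training samples: 3 bands, 3 structural classes
X_train = np.vstack([rng.normal(loc=m, scale=1.0, size=(50, 3))
                     for m in (0.0, 2.0, 4.0)])
y_train = np.repeat(["rainforest", "wet sclerophyll", "non-forest"], 50)

mlc = QuadraticDiscriminantAnalysis(priors=[1 / 3, 1 / 3, 1 / 3])
mlc.fit(X_train, y_train)

# Classify an image cube reshaped to (n_pixels, n_bands), then back to 2-D
image = rng.normal(loc=2.0, scale=1.5, size=(100, 100, 3))
labels = mlc.predict(image.reshape(-1, 3)).reshape(100, 100)
print(np.unique(labels, return_counts=True))
```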

Relevance:

80.00%

Publisher:

Abstract:

The recent deregulation in electricity markets worldwide has heightened the importance of risk management in energy markets. Assessing Value-at-Risk (VaR) in electricity markets is arguably more difficult than in traditional financial markets because the distinctive features of the former result in a highly unusual distribution of returns: electricity returns are highly volatile, display seasonalities in both their mean and volatility, exhibit leverage effects and clustering in volatility, and feature extreme levels of skewness and kurtosis. With electricity applications in mind, this paper proposes a model that accommodates autoregression and weekly seasonals in both the conditional mean and conditional volatility of returns, as well as leverage effects via an EGARCH specification. In addition, extreme value theory (EVT) is adopted to explicitly model the tails of the return distribution. Compared to a number of other parametric models and simple historical-simulation-based approaches, the proposed EVT-based model performs well in forecasting out-of-sample VaR. In addition, statistical tests show that the proposed model provides appropriate interval coverage in both unconditional and, more importantly, conditional contexts. Overall, the results are encouraging in suggesting that the proposed EVT-based model is a useful technique for forecasting VaR in electricity markets. (c) 2005 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.
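
The EVT component can be illustrated with a short peaks-over-threshold sketch: fit a generalized Pareto distribution to losses above a high threshold and invert it for VaR. In the paper this step is applied to EGARCH-standardized residuals; the simulated heavy-tailed returns, the 95 percent threshold and the helper name var_evt below are assumptions made here for illustration.

```python
# A minimal peaks-over-threshold sketch: fit a generalized Pareto distribution
# (GPD) to losses above a high threshold and invert it for VaR. The paper
# applies this to EGARCH-standardized residuals; the simulated returns, the
# 95% threshold and the helper name var_evt are illustrative assumptions.
import numpy as np
from scipy.stats import genpareto, t as student_t

returns = student_t.rvs(df=3, size=5000, random_state=42)    # heavy-tailed returns
losses = -returns                                            # work with losses

u = np.quantile(losses, 0.95)                 # threshold at the 95th percentile
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)             # GPD shape and scale

def var_evt(alpha, u, xi, beta, n_exceed, n_total):
    """POT estimator: VaR_alpha = u + (beta/xi)*(((1-alpha)/(Nu/N))**(-xi) - 1)."""
    return u + (beta / xi) * (((1 - alpha) / (n_exceed / n_total)) ** (-xi) - 1)

print("99% VaR:", var_evt(0.99, u, xi, beta, exceedances.size, losses.size))
```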

Relevance:

80.00%

Publisher:

Abstract:

The estimation of a concentration-dependent diffusion coefficient in a drying process is known as an inverse coefficient problem. The solution is sought wherein the space-averaged concentration is known as a function of time (mass loss monitoring). The problem is stated as the minimization of a functional, and gradient-based algorithms are used to solve it. Many numerical and experimental examples that demonstrate the effectiveness of the proposed approach are presented. Thin-slab drying was carried out in an isothermal drying chamber built in our laboratory. The diffusion coefficients of fructose obtained with the present method are compared with existing literature results.
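
A minimal sketch of this inverse formulation, under assumed simplifications: the diffusivity is parametrized as D(c) = D0*exp(k*c) (an assumed functional form, not necessarily the paper's), a 1-D slab drying model is solved by explicit finite differences, and the parameters are estimated by gradient-based minimization of the misfit in the space-averaged concentration history.

```python
# A minimal sketch of the inverse formulation under assumed simplifications:
# D(c) = D0 * exp(k * c) (an assumed functional form), an explicit
# finite-difference model of 1-D slab drying, and gradient-based minimization
# (L-BFGS-B) of the misfit in the space-averaged concentration history.
import numpy as np
from scipy.optimize import minimize

def mean_concentration(params, nz=41, nt=2000, L=1.0, T=1.0):
    """Space-averaged concentration vs. time for D(c) = D0 * exp(k * c)."""
    D0, k = params
    dz, dt = L / (nz - 1), T / nt
    c = np.ones(nz)                       # dimensionless initial concentration
    means = np.empty(nt)
    for n in range(nt):
        D = D0 * np.exp(k * c)
        flux = D[:-1] * np.diff(c) / dz   # one-sided interface diffusivity
        c[1:-1] += dt / dz * np.diff(flux)
        c[0] = c[-1] = 0.0                # drying surfaces held at zero
        means[n] = c.mean()
    return means

# Synthetic "measured" mass-loss curve from known parameters, plus noise
data = mean_concentration((0.05, 1.0))
data = data + np.random.default_rng(0).normal(scale=1e-3, size=data.size)

# Functional to minimize: squared misfit of the average-concentration history
objective = lambda p: np.sum((mean_concentration(p) - data) ** 2)
result = minimize(objective, x0=[0.02, 0.5], method="L-BFGS-B",
                  bounds=[(1e-3, 0.1), (0.0, 1.5)])
print(result.x)                           # estimated (D0, k)
```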

Relevance:

80.00%

Publisher:

Abstract:

This paper considers a model-based approach to the clustering of tissue samples on the basis of a very large number of genes from microarray experiments. It is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. Frequently in practice, clinical data are also available on the cases from which the tissue samples have been obtained. Here we investigate how to use the clinical data in conjunction with the microarray gene expression data to cluster the tissue samples. We propose two mixture model-based approaches in which the number of components in the mixture model corresponds to the number of clusters to be imposed on the tissue samples. One approach specifies the components of the mixture model to be the conditional distributions of the microarray data given the clinical data, with the mixing proportions also conditioned on the latter data. The other takes the components of the mixture model to represent the joint distributions of the clinical and microarray data. The approaches are demonstrated on some breast cancer data, as studied recently in van't Veer et al. (2002).
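
A simplified sketch of the second variant (mixture components for the joint distribution of clinical and microarray data): after reducing the gene dimension, a Gaussian mixture is fitted to the concatenated clinical-plus-expression vector. The simulated data, the PCA reduction and the diagonal covariance choice are illustrative assumptions rather than the authors' exact model.

```python
# A simplified sketch of the joint-distribution variant: reduce the gene
# dimension, concatenate the clinical covariates, and fit a Gaussian mixture
# whose components define the tissue clusters. The simulated data, the PCA
# step and the diagonal covariances are assumptions for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_tissues, n_genes, n_clinical = 60, 2000, 4

expression = rng.normal(size=(n_tissues, n_genes))      # microarray data
expression[:30] += 1.0                                   # two latent tissue groups
clinical = rng.normal(size=(n_tissues, n_clinical))      # clinical covariates

genes_reduced = PCA(n_components=5).fit_transform(expression)
X = np.hstack([clinical, genes_reduced])                 # joint feature vector

gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
cluster_labels = gmm.fit_predict(X)                      # cluster label per tissue
print(np.bincount(cluster_labels))
```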

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a critical comparison of major changes in engineering education in both Australia and Europe. European engineering programs are currently being reshaped by the Bologna process, representing a move towards quality assurance in higher education and the mutual recognition of degrees among universities across Europe. Engineering education in Australia underwent a transformation after the 1996 review of engineering education. The paper discusses the recent European developments in order to give up-to-date information on this fast-changing and sometimes obscure process. The comparison draws on the implications of the Bologna Process for the German engineering education system as an example. It concludes with issues of particular interest, which can help to inform the international discussion on how to meet today's challenges for engineering education. These issues include ways of achieving diversity among engineering programs, means of enabling student and staff mobility, and the preparation of engineering students for professional practice through engineering education. As a result, the benefits of outcomes-based approaches in education are discussed. This leads to an outlook for further research into the broader attributes required by future professional engineers. © 2005, Australasian Association for Engineering Education

Relevance:

80.00%

Publisher:

Abstract:

Experiments with simulators allow psychologists to better understand the causes of human errors and to build models of cognitive processes for use in human reliability assessment (HRA). This paper investigates an approach to task failure analysis based on patterns of behaviour, in contrast to more traditional event-based approaches. It considers, as a case study, a formal model of an air traffic control (ATC) system which incorporates controller behaviour. The cognitive model is formalised in the CSP process algebra. Patterns of behaviour are expressed as temporal logic properties. A model-checking technique is then used to verify whether the decomposition of the operator's behaviour into patterns is sound and complete with respect to the cognitive model. The decomposition is shown to be incomplete, and a new behavioural pattern is identified which appears to have been overlooked in the analysis of the data provided by the experiments with the simulator. This illustrates how formal analysis of operator models can yield fresh insights into how failures may arise in interactive systems.

Relevance:

80.00%

Publisher:

Abstract:

Collaborative filtering is regarded as one of the most promising recommendation algorithms. Item-based approaches for collaborative filtering identify the similarity between two items by comparing users' ratings on them. In these approaches, ratings produced at different times are weighted equally; that is, changes in user purchase interest are not taken into consideration. For example, an item that was rated recently by a user should have a bigger impact on the prediction of future user behaviour than an item that was rated a long time ago. In this paper, we present a novel algorithm to compute time weights for different items in a manner that assigns a decreasing weight to old data. Moreover, users' purchase habits vary, and even the same user has quite different attitudes towards different items. Our proposed algorithm therefore uses clustering to discriminate between different kinds of items. For each item cluster, we trace each user's change in purchase interest and introduce a personalized decay factor according to the user's own purchase behaviour. Empirical studies have shown that our new algorithm substantially improves the precision of item-based collaborative filtering without introducing higher-order computational complexity.
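
The time-weighting idea can be sketched in a few lines: each rating's contribution to an item-based prediction is discounted by an exponential decay in its age. The function below uses a single global decay factor lambda_ for illustration, whereas the paper personalizes the decay per user and per item cluster; all names and data are illustrative.

```python
# A minimal sketch of the time-weighting idea: in item-based collaborative
# filtering, each rating's contribution is discounted by an exponential decay
# in its age. A single global decay factor lambda_ is used here; the paper
# personalizes it per user and per item cluster. All names and data are
# illustrative.
import numpy as np

def predict_rating(target_item, user_ratings, rating_times, item_similarity,
                   now, lambda_=0.01):
    """Time-weighted item-based CF prediction for one user and one item.

    user_ratings:    {item_id: rating} for the active user
    rating_times:    {item_id: time of rating, in days} for those ratings
    item_similarity: {(item_a, item_b): similarity in [0, 1]}
    """
    num = den = 0.0
    for item, rating in user_ratings.items():
        sim = item_similarity.get((target_item, item), 0.0)
        weight = np.exp(-lambda_ * (now - rating_times[item]))  # decays with age
        num += sim * weight * rating
        den += sim * weight
    return num / den if den > 0 else None

# Illustrative usage: item B was rated long ago, so it influences the
# prediction for item C less than the recently rated item A.
ratings = {"A": 5.0, "B": 2.0}
times = {"A": 300.0, "B": 10.0}
sims = {("C", "A"): 0.8, ("C", "B"): 0.7}
print(predict_rating("C", ratings, times, sims, now=310.0))
```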

Relevance:

80.00%

Publisher:

Abstract:

Around the world, consumers and retailers of fresh produce are becoming more and more discerning about factors such as food safety and traceability, health, convenience and the sustainability of production systems, and in doing so they are changing the way in which fresh produce supply chains are configured and managed. When consumers demand fresh, safe, convenient, value-for-money produce, retailers in an increasingly competitive environment are attracted to those business models most capable of meeting these demands profitably. Traditional models are proving less and less able to deliver competitive advantage in such an environment. As a result, opportunistic, adversarial, price-based approaches to doing business between chain members are being replaced by approaches that are more strategic, collaborative and value-based. The shaping force behind this change is the need for producers, wholesalers, category managers, retailers and consumers to have more certainty about the performance of the supply chains upon which they rely. Certainty is generated through the supply chain's ability to create, deliver and share value. How to build supply chains that create, deliver and share value is arguably the single biggest challenge to the competitiveness of fresh produce firms, and therefore to the industries to which they belong.