874 results for Filmic approach methods


Relevance: 30.00%

Abstract:

Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive, validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed with BACE-1, as well as of five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared with other popular approaches for generating a molecular alignment, such as docking. The influence of the partial-charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r² = 0.88, q² = 0.69 and r²pred = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. Visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
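As a hedged illustration of the validation statistics quoted above (q² and external r²pred), the Python sketch below computes both for a generic regression model. It is not the CoMFA/PLS workflow of the study: the descriptor matrix and activities are random placeholders, and an ordinary least-squares model stands in for the partial-least-squares regression normally used in CoMFA.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

def q2_loo(X, y):
    """Leave-one-out cross-validated q2 = 1 - PRESS / SS_total."""
    press = 0.0
    for train, test in LeaveOneOut().split(X):
        model = LinearRegression().fit(X[train], y[train])
        press += float(np.sum((y[test] - model.predict(X[test])) ** 2))
    return 1.0 - press / float(np.sum((y - y.mean()) ** 2))

def r2_pred(model, X_ext, y_ext, y_train_mean):
    """External predictive r2, referenced to the training-set mean activity."""
    ss_res = np.sum((y_ext - model.predict(X_ext)) ** 2)
    ss_tot = np.sum((y_ext - y_train_mean) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical descriptor matrix and activities (random placeholders only).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(40, 5)), rng.normal(size=40)
model = LinearRegression().fit(X[:30], y[:30])
print(q2_loo(X[:30], y[:30]), r2_pred(model, X[30:], y[30:], y[:30].mean()))
```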

Relevance: 30.00%

Abstract:

This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, while others are not. These conclusions in turn lead to additional computational experiments suggesting where to focus our theoretical and computational efforts next.
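For orientation, here is a minimal Python sketch of the basic augmented Lagrangian pattern described above: an outer loop that updates multipliers and the penalty parameter, and an inner minimization handed to a nonlinear conjugate gradient solver (SciPy's "CG" method). It uses a plain quadratic penalty with first-order multiplier updates on a toy equality-constrained problem; the paper's relative error criterion and double regularized proximal kernels are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, c, x0, iters=20, rho=10.0, tol=1e-6):
    """Minimal equality-constrained augmented Lagrangian sketch:
    outer multiplier updates, inner nonlinear-CG minimization."""
    x, lam = np.asarray(x0, float), np.zeros(len(c(x0)))
    for _ in range(iters):
        def L(z):
            cz = c(z)
            return f(z) + lam @ cz + 0.5 * rho * cz @ cz
        x = minimize(L, x, method="CG").x          # inner loop (nonlinear CG)
        cx = c(x)
        lam = lam + rho * cx                       # first-order multiplier update
        if np.linalg.norm(cx) < tol:
            break
        rho *= 2.0                                 # tighten penalty while infeasible
    return x, lam

# Toy problem: minimize (x-1)^2 + (y-2)^2 subject to x + y = 1.
f = lambda z: (z[0] - 1) ** 2 + (z[1] - 2) ** 2
c = lambda z: np.array([z[0] + z[1] - 1.0])
print(augmented_lagrangian(f, c, [0.0, 0.0]))
```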

Relevance: 30.00%

Abstract:

In this work, we deal with the problem of packing (orthogonally and without overlapping) identical rectangles in a rectangle. This problem appears in different logistics settings, such as the loading of boxes on pallets, the arrangement of pallets in trucks and the stowing of cargo in ships. We present a recursive partitioning approach combining improved versions of a recursive five-block heuristic and an L-approach for packing rectangles into larger rectangles and L-shaped pieces. The combined approach is able to rapidly find the optimal solutions of all instances of the pallet loading problem sets Cover I and II (more than 50 000 instances). Compared with other methods from the literature, it is also effective for solving the instances of problem set Cover III (almost 100 000 instances) and practical examples of a woodpulp stowage problem. Some theoretical results are also discussed and, based on them, efficient computer implementations are introduced. The computer implementation and the data sets are available for benchmarking purposes. Journal of the Operational Research Society (2010) 61, 306-320. doi: 10.1057/jors.2008.141. Published online 4 February 2009.
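To make the idea of recursive partitioning concrete, the sketch below counts how many identical l x w pieces fit in an L x W rectangle using a much simpler two-block guillotine recursion with memoization. It only illustrates the recursive style; it is not the five-block heuristic or the L-approach of the paper, and the example dimensions are arbitrary.

```python
from functools import lru_cache

def max_identical_rectangles(L, W, l, w):
    """How many l x w pieces fit in an L x W rectangle using guillotine cuts only."""
    @lru_cache(maxsize=None)
    def pack(length, width):
        best = 0
        for pl, pw in ((l, w), (w, l)):          # try both piece orientations
            if pl <= length and pw <= width:
                # Place one piece in a corner, then split the remainder into
                # two disjoint sub-rectangles (two alternative guillotine splits).
                best = max(best,
                           1 + pack(length - pl, pw) + pack(length, width - pw),
                           1 + pack(pl, width - pw) + pack(length - pl, width))
        return best
    return pack(L, W)

print(max_identical_rectangles(10, 6, 3, 2))   # 10 pieces fill the area exactly
```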

Relevance: 30.00%

Abstract:

Approximate Lie symmetries of the Navier-Stokes equations are used for applications to scaling phenomena arising in turbulence. In particular, we show that the Lie symmetries of the Euler equations are inherited by the Navier-Stokes equations in the form of approximate symmetries, which makes it possible to incorporate the Reynolds-number dependence into scaling laws. Moreover, the optimal systems of all finite-dimensional Lie subalgebras of the approximate symmetry transformations of the Navier-Stokes equations are constructed. We show how the scaling groups obtained can be used to introduce the Reynolds-number dependence into scaling laws explicitly for stationary parallel turbulent shear flows. This is demonstrated in the framework of a new approach to deriving scaling laws based on symmetry analysis [11]-[13].
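For orientation only, the block below recalls the textbook scaling groups involved (standard forms, not the optimal systems constructed in the paper): the Euler equations admit a two-parameter scaling, while at fixed viscosity the Navier-Stokes equations retain only a one-parameter subgroup exactly; the broken second parameter is what reappears as an approximate symmetry carrying the Reynolds-number dependence.

```latex
\begin{aligned}
&\text{Euler (two free parameters } a, b\text{):}
 & x^* &= e^{a}x, & t^* &= e^{b}t, & u^* &= e^{a-b}u, & p^* &= e^{2(a-b)}p,\\
&\text{Navier--Stokes, fixed } \nu \text{ (only } b = 2a \text{ exact):}
 & x^* &= e^{a}x, & t^* &= e^{2a}t, & u^* &= e^{-a}u, & p^* &= e^{-2a}p.
\end{aligned}
```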

Relevance: 30.00%

Abstract:

Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always be correct. The skew-normal/independent distribution is a class of asymmetric thick-tailed distributions which includes the skew-normal distribution as a special case. In this paper, we explore the use of the skew-normal/independent distribution as a robust alternative for the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal distribution, the skew-t distribution, the skew-slash distribution and the skew-contaminated normal distribution. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
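As a rough illustration of the distribution family mentioned above, the sketch below draws samples from skew-normal/independent distributions via the usual scale-mixture representation Y = mu + Z / sqrt(U), with Z skew-normal. The mixing distributions for the skew-t, skew-slash and skew-contaminated normal cases follow the standard parameterizations; all parameter values are illustrative placeholders, not estimates from the dental data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def sni_sample(n, mu=0.0, sigma=1.0, lam=3.0, family="skew-t", nu=4.0, gamma=0.5):
    """Draw n values from a skew-normal/independent (SNI) distribution."""
    z = stats.skewnorm.rvs(a=lam, loc=0.0, scale=sigma, size=n, random_state=rng)
    if family == "skew-normal":
        u = np.ones(n)                                        # U == 1
    elif family == "skew-t":
        u = rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)   # U ~ Gamma(nu/2, nu/2)
    elif family == "skew-slash":
        u = rng.beta(nu, 1.0, size=n)                         # U ~ Beta(nu, 1)
    else:
        # skew-contaminated normal: U = gamma with prob. nu, else 1 (needs 0 < nu < 1)
        u = np.where(rng.random(n) < nu, gamma, 1.0)
    return mu + z / np.sqrt(u)

print(sni_sample(5, family="skew-t"))
```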

Relevance: 30.00%

Abstract:

Two techniques, namely UV-vis and FTIR spectroscopy, have been employed in order to calculate the degree of substitution (DS) of cellulose carboxylic esters, including acetates (CAs), butyrates (CBs), and hexanoates (CHs). Regarding UV-vis spectroscopy, we have employed a novel approach based on measuring how λmax of the intramolecular charge-transfer bands of polarity probes adsorbed on the ester films depends on DS (solvatochromism). Additionally, we have revisited the use of FTIR for DS determination. Several methods have been used in order to plot the Beer's law graph, namely: absorption of KBr pellets pre-coated with CA; reflectance (DRIFTS) of CA-KBr solid-solid mixtures, with or without the use of 1,4-dicyanobenzene as an internal reference; and reflectance of KBr powder pre-coated with CA. The methods indicated are simple, fast, and accurate, requiring much less ester than the titration method. The probe method is independent of the experimental variables examined. (c) 2010 Published by Elsevier Ltd.
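The DS determination rests on a Beer's-law-type calibration: the absorbance of a chosen band is assumed to vary linearly with DS, the line is fitted to standards of known DS, and the fit is inverted for unknown samples. The sketch below shows that arithmetic with invented calibration values; none of the numbers come from the paper.

```python
import numpy as np

# Hypothetical calibration data: band absorbance measured for cellulose acetate
# standards of known degree of substitution (placeholders, not the paper's data).
ds_standards = np.array([0.8, 1.2, 1.8, 2.4, 2.9])
absorbance   = np.array([0.21, 0.33, 0.49, 0.66, 0.80])

# Beer's-law-type calibration: absorbance assumed linear in DS.
slope, intercept = np.polyfit(ds_standards, absorbance, deg=1)

def ds_from_absorbance(a):
    """Invert the calibration line to estimate the DS of an unknown sample."""
    return (a - intercept) / slope

print(round(ds_from_absorbance(0.55), 2))
```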

Relevance: 30.00%

Abstract:

Herein, we report a new view of the FePt nanoparticle formation mechanism, obtained by studying the evolution of particle size and composition during synthesis by the modified polyol process. One of the factors limiting the application of these particles in ultra-high-density magnetic storage media is the particle-to-particle composition variation, which affects the A1-to-L1₀ transformation as well as their magnetic properties. There are many controversies in the literature concerning the mechanism of FePt formation, which seems to be the key to understanding the compositional distribution. Our results convincingly show that, initially, Pt nuclei are formed by reduction of Pt(acac)₂ by the diol, followed by heterocoagulation of Fe cluster species, formed by thermal decomposition of Fe(acac)₃, onto the Pt nuclei. Complete reduction of the heterocoagulated iron species seems to involve a CO-spillover process, in which the Pt nuclei surface acts as a heterogeneous catalyst, leading to improved control of the single-particle composition and allowing a much narrower compositional distribution. Our results show significant decreases in the particle-to-particle composition range, improving the A1-to-L1₀ phase transformation and, consequently, the magnetic properties when compared with other reported methods.

Relevance: 30.00%

Abstract:

Friction plays a key role in causing slipperiness, as a low coefficient of friction on the road may result in slippery and hazardous conditions. Analyzing the strong relation between friction and accident risk on winter roads is a difficult task. Many weather forecasting organizations use a variety of standard and bespoke methods to predict the coefficient of friction on roads. This article proposes an approach to predicting the extent of slipperiness by building and testing an expert system. It estimates the coefficient of friction on winter roads in the province of Dalarna, Sweden, using the prevailing weather conditions as a basis. Weather data from the Swedish road weather information system (RWIS) were used. The focus of the project was to use the expert system as part of a major project, VITSA, within the domain of intelligent transport systems.
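To show the flavour of such a rule base, here is a toy sketch that maps a few RWIS-style observations to a rough friction coefficient. The rules, thresholds and friction values are invented for illustration and are not the rules of the expert system described in the article.

```python
def estimate_friction(surface_temp_c, dew_point_c, precipitation_mm_h, road_state):
    """Toy rule base mapping weather-station observations to a rough friction
    coefficient. Thresholds and values are illustrative placeholders only."""
    if road_state == "ice" or (surface_temp_c <= 0 and precipitation_mm_h > 0):
        return 0.1                      # wet ice / freezing precipitation
    if surface_temp_c <= 0 and dew_point_c >= surface_temp_c:
        return 0.2                      # hoar frost likely
    if road_state == "snow":
        return 0.3                      # packed snow
    if precipitation_mm_h > 0:
        return 0.5                      # wet asphalt
    return 0.8                          # dry bare road

print(estimate_friction(-2.0, -1.5, 0.0, "bare"))   # frost risk -> 0.2
```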

Relevance: 30.00%

Abstract:

This paper uses examples from a Swedish study to suggest some ways in which cultural variation could be included in studies of thermal comfort. It is shown how only a slight shift of focus and methodological approach could help us discover aspects of human life that add to existing comfort research on how human beings perceive and handle warmth and cold. It is concluded that it is not enough for buildings, heating systems and thermal control devices to be energy-efficient in a merely technical sense. If these are to help decrease, rather than increase, energy consumption, they have to support those parts of already existing habits and modes of thought that have the potential for low energy use. This is one reason why culture-specific features and emotional cores need to be investigated and deployed in the study and development of thermal comfort.

Relevance: 30.00%

Abstract:

This thesis uses the zonal travel cost method (ZTCM) to estimate the consumer surplus of the Peace & Love festival in Borlänge, Sweden. The study defines counties as the zones of origin of the visitors, and visiting rates from each zone are estimated from survey data. The study is novel in that the TCM has mostly been applied in the environmental and recreational sector, not to short-term events such as the P&L festival. The analysis shows that travel cost has a significantly negative effect on the visiting rate, as expected. Even though income has been shown to be significant in similar studies, it turns out to be insignificant in this study. A point estimate of the total consumer surplus of the P&L festival is 35.6 million Swedish kronor. However, this point estimate is associated with high uncertainty, since its 95% confidence interval is (17.9, 53.2) million. It is also important to note that the estimated value represents only one part of the festival's total economic value; the other components of that value have not been estimated in this thesis.
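The core ZTCM arithmetic can be sketched in a few lines: regress zonal visiting rates on travel cost, convert the fitted demand relationship into a surplus per visit, and scale by total attendance. The sketch below uses a semi-log specification and invented zonal figures; neither the functional form nor the numbers are taken from the thesis.

```python
import numpy as np

# Hypothetical zonal data (placeholders, not the thesis survey data):
# visits per 1,000 inhabitants and average round-trip travel cost (SEK) per zone.
travel_cost = np.array([200.0, 400.0, 700.0, 1100.0, 1600.0])
visit_rate  = np.array([6.0,   3.5,   1.8,   0.9,    0.4])

# Semi-log zonal travel cost model: ln(visit_rate) = a + b * travel_cost, b < 0.
b, a = np.polyfit(travel_cost, np.log(visit_rate), deg=1)

# For a semi-log demand curve, consumer surplus per visit is -1/b.
cs_per_visit = -1.0 / b
total_visits = 50_000                      # assumed total festival attendance
total_cs = cs_per_visit * total_visits
print(f"CS per visit ~ {cs_per_visit:.0f} SEK, total ~ {total_cs / 1e6:.1f} MSEK")
```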

Relevance: 30.00%

Abstract:

Dynamic system test methods for heating systems had already been developed and applied by the institutes SERC and SP in Sweden, INES in France and SPF in Switzerland before the MacSheep project started. These test methods followed the same principle: a complete heating system, including heat generators, storage, control etc., is installed on the test rig; the test rig software and hardware simulates and emulates the heat load for space heating and domestic hot water of a single-family house, while the unit under test has to act autonomously to cover the heat demand during a representative test cycle. Within work package 2 of the MacSheep project these similar, but different, test methods were harmonized and improved. The work undertaken includes:

• Harmonization of the physical boundaries of the unit under test.
• Harmonization of the boundary conditions of climate and load.
• Definition of an approach to reach an identical space heat load in combination with autonomous control of the space heat distribution by the unit under test.
• Derivation and validation of new six-day and twelve-day test profiles for direct extrapolation of test results.

The new harmonized test method combines the advantages of the different methods that existed before the MacSheep project. The new method is a benchmark test, which means that the load for space heating and domestic hot water preparation is identical for all tested systems, and that the result is representative of the performance of the system over a whole year. Thus, no modelling and simulation of the tested system is needed in order to obtain the benchmark results for a yearly cycle, and the method is therefore also applicable to products for which simulation models are not yet available. Some of the advantages of the new whole-system test method and performance rating, compared with the testing and energy rating of single components, are:

• Interactions between the different components of a heating system, e.g. storage, solar collector circuit, heat pump and control, are included and evaluated in this test.
• Dynamic effects are included and influence the result just as they influence the annual performance in the field.
• Heat losses influence the results in a more realistic way, since they are evaluated under "real installed" and representative part-load conditions rather than under single-component steady-state conditions.

The described method is also suited to the development process of new systems, where it replaces time-consuming and costly field testing, with the advantage of higher accuracy of the measured data (compared with the measurement equipment typically used in field tests) and identical, thus comparable, boundary conditions. The method can therefore be used for system optimization on the test bench under realistic operating conditions, i.e. under a relevant operating environment in the lab. This report describes the physical boundaries of the tested systems, as well as the test procedures and the requirements for both the unit under test and the test facility. The new six-day and twelve-day test profiles are also described, as are the validation results.
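To illustrate what direct extrapolation of test results can mean in practice, the sketch below turns a six-day whole-system test log into an annual performance figure by weighting each test day by the number of days of the year it is taken to represent. The energy values and weights are invented placeholders; the actual MacSheep profiles and their weighting are defined in the project deliverables.

```python
# Hypothetical whole-system test log: heat delivered (space heating + DHW) and
# electricity drawn by the unit under test on each day of a six-day benchmark
# cycle. Values and day weights are illustrative assumptions only.
heat_delivered_kwh   = [92.0, 48.0, 31.0, 15.0, 9.0, 22.0]
electricity_used_kwh = [28.0, 13.0,  8.0,  3.5, 2.0, 6.0]

# Each test day is taken to represent a slice of the year (weights sum to 365).
days_represented = [40, 55, 70, 80, 60, 60]

annual_heat = sum(h * d for h, d in zip(heat_delivered_kwh, days_represented))
annual_elec = sum(e * d for e, d in zip(electricity_used_kwh, days_represented))
spf = annual_heat / annual_elec          # seasonal performance factor estimate
print(f"Extrapolated SPF ~ {spf:.2f}")
```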

Relevance: 30.00%

Abstract:

Purpose: This paper aims to extend and contribute to prior research on the association between company characteristics and the choice of capital budgeting methods (CBMs). Design/methodology/approach: A multivariate regression analysis of questionnaire data from 2005 and 2008 is used to study which factors determine the choice of CBMs in Swedish listed companies. Findings: Our results supported the hypotheses that Swedish listed companies have become more sophisticated over the years (or at least less unsophisticated), which indicates a closing of the theory-practice gap; that companies with greater leverage used payback more often; and that companies with stricter debt targets and less management ownership employed the accounting rate of return more frequently. Moreover, larger companies used CBMs more often. Originality/value: The paper contributes to prior research within this field by being the first Swedish study to examine the association between the use of CBMs and as many as twelve independent variables, including changes over time, using multivariate regression analysis. The results are compared with a US and a continental European study.

Relevance: 30.00%

Abstract:

This paper is a preliminary investigation into the application of the formal-logical theory of normative positions to the characterisation of normative-informational positions, pertaining to rules that are meant to regulate the supply of information. First, we present the proposed framework. Next, we identify the kinds of nuances and distinctions that can be articulated in such a logical framework. Finally, we show how such nuances can arise in specific regulations. Reference is made to Data Protection Law and Contract Law, among others. The proposed approach is articulated around two essential steps. The first involves identifying the set of possible interpretations that can be given to a particular norm. This is done by using formal methods. The second involves picking out one of these interpretations as the most likely one. This second step can be resolved only by using further information (e.g., the context or other parts of the regulation).
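A very small sketch of the first step described above: enumerating candidate formal readings of an ambiguous information-supply norm as combinations of a deontic operator, an agency reading and a scope of information. The norm text and the dimensions are invented for illustration; they are not taken from Data Protection Law, Contract Law or the paper's formal framework.

```python
from itertools import product

# Invented example norm whose formal reading is ambiguous.
norm = "X shall see to it that Y is informed of the incident"

modalities  = ["obligatory", "permitted"]                      # deontic operator
agency      = ["X brings it about", "X does not prevent"]       # action reading
information = ["Y knows the incident occurred",
               "Y knows full incident details"]                 # scope of information

# Step 1: enumerate the candidate interpretations.
interpretations = [f"{m}({a} that {i})"
                   for m, a, i in product(modalities, agency, information)]
for reading in interpretations:
    print(reading)

# Step 2 (not shown): context or other parts of the regulation are used to pick
# the most plausible reading from this candidate set.
```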

Relevance: 30.00%

Abstract:

Canada releases over 150 billion litres of untreated and undertreated wastewater into the water environment every year [1]. To clean up urban wastewater, new federal Wastewater Systems Effluent Regulations (WSER), which establish national baseline effluent quality standards achievable through secondary wastewater treatment, were enacted on July 18, 2012. With respect to wastewater from combined sewer overflows (CSOs), the Regulations require municipalities to report the annual quantity and frequency of effluent discharges. The City of Toronto currently has about 300 CSO locations within an area of approximately 16,550 hectares. The total sewer length of the CSO area is about 3,450 km and the number of sewer manholes is about 51,100. System-wide monitoring of all CSO locations has never been undertaken because of cost and practicality. Instead, the City has relied on estimation methods and modelling approaches in the past, allowing funds that would otherwise be used for monitoring to be applied to reducing the impacts of the CSOs. To fulfill the WSER requirements, the City is now undertaking a study in which GIS-based hydrologic and hydraulic modelling is the approach. Results show the usefulness of this approach for 1) determining the flows contributing to the combined sewer system in the local and trunk sewers under dry weather flow, wet weather flow and snowmelt conditions; 2) assessing the hydraulic grade line and surface water depth in all local and trunk sewers under heavy rain events; 3) analysing local and trunk sewer capacities for future growth; and 4) reporting the annual quantity and frequency of CSOs as required by the new Regulations. This modelling approach has also allowed funds to be applied toward reducing and ultimately eliminating the adverse impacts of CSOs rather than expending resources on unnecessary and costly monitoring.
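Once the hydraulic model produces an overflow time series for each CSO location, the WSER reporting quantities reduce to simple aggregation: total volume discharged and the number of discrete overflow events per year. The sketch below shows that aggregation on an invented hourly series; it is not output from the City's model.

```python
import numpy as np

# Hypothetical modelled overflow series for one CSO outfall: volume discharged
# (cubic metres) in each hourly time step of a year. Placeholder data only.
rng = np.random.default_rng(42)
hourly_overflow_m3 = np.where(rng.random(8760) < 0.01,
                              rng.gamma(2.0, 500.0, 8760), 0.0)

# Annual quantity: total volume discharged over the year.
annual_volume_m3 = hourly_overflow_m3.sum()

# Annual frequency: number of discrete overflow events, counting consecutive
# non-zero hours as a single event.
active = hourly_overflow_m3 > 0
events = int(np.sum(active[1:] & ~active[:-1]) + active[0])

print(f"Annual CSO volume: {annual_volume_m3:,.0f} m3 over {events} events")
```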