877 results for "new method"


Relevance: 60.00%

Abstract:

In this paper, we propose a new method for measuring very slow paramagnetic-ion diffusion coefficients using a commercial high-resolution spectrometer. If distinct paramagnetic ions influence the hydrogen nuclear magnetic relaxation time differently, their diffusion coefficients can be measured separately. A cylindrical phantom filled with Fricke xylenol gel solution and irradiated with gamma rays was used to validate the method. The Fricke xylenol gel solution was prepared with 270 Bloom porcine gelatin, the phantom was irradiated with gamma rays from a ⁶⁰Co source, and a high-resolution 200 MHz nuclear magnetic resonance (NMR) spectrometer was used to obtain the phantom ¹H profile in the presence of a linear magnetic field gradient. By observing the temporal evolution of the phantom NMR profile, an apparent ferric-ion diffusion coefficient of 0.50 μm²/ms was obtained. In any medical procedure where ionizing radiation is used, dose planning and dose delivery are key elements for patient safety and treatment success. These points become even more important in modern conformal radiotherapy techniques, such as stereotactic radiosurgery, where the dose delivered in a single treatment session can be an order of magnitude higher than regular radiotherapy doses. Several methods have been proposed to obtain the three-dimensional (3-D) dose distribution. Recently, we proposed an alternative method for 3-D radiation dose mapping, in which the ionizing radiation modifies the local relative concentration of Fe²⁺/Fe³⁺ in a phantom containing Fricke gel and this variation is associated with the MR image intensity. The smearing of the intensity gradient is proportional to the diffusion coefficient of Fe³⁺ and Fe²⁺ in the phantom. There are several NMR methods for measuring ionic diffusion; however, they are applicable only when the diffusion is not very slow.
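The smearing picture admits a simple quantitative sketch (illustrative, with invented values; only the 0.50 μm²/ms order of magnitude comes from the abstract): in one dimension, a sharp concentration step broadens into an error-function profile whose Gaussian width obeys σ² = 2Dt, so the diffusion coefficient follows from a straight-line fit of the squared width against time.

```python
import numpy as np

# Hedged sketch: widths (sigma, in um) measured from successive NMR
# profiles at times t (in ms) should follow sigma^2 = 2*D*t for 1-D
# diffusion, so D is the slope of a line through the origin.

def estimate_diffusion_coefficient(times_ms, sigmas_um):
    """Least-squares fit of sigma^2 = 2*D*t; returns D in um^2/ms."""
    x = 2.0 * np.asarray(times_ms, dtype=float)
    y = np.asarray(sigmas_um, dtype=float) ** 2
    return np.sum(x * y) / np.sum(x * x)

# Synthetic widths generated with D = 0.5 um^2/ms, the order of
# magnitude reported above.
times = np.array([1000.0, 2000.0, 4000.0, 8000.0])    # ms
sigmas = np.sqrt(2 * 0.5 * times)                     # um
print(round(estimate_diffusion_coefficient(times, sigmas), 6))  # 0.5
```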

Relevance: 60.00%

Abstract:

This paper completes the review of the theory of self-adjoint extensions of symmetric operators for physicists as a basis for constructing quantum-mechanical observables. It contains a comparative presentation of the well-known methods and a newly proposed method for constructing ordinary self-adjoint differential operators associated with self-adjoint differential expressions in terms of self-adjoint boundary conditions. The new method has the advantage that it does not require explicitly evaluating deficient subspaces and deficiency indices (the latter are determined in passing) and that the boundary conditions are explicit irrespective of the singularity of the differential expression. General assertions and constructions are illustrated by examples of well-known quantum-mechanical operators such as the momentum operator and the Hamiltonian.
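As a standard textbook illustration of the constructions being reviewed (not an example taken from this paper), consider the momentum differential expression on an interval, where self-adjoint boundary conditions appear explicitly:

```latex
% \hat p = -i\,d/dx on L^2(0,L), first defined on smooth functions
% vanishing at both endpoints, is symmetric with deficiency indices (1,1).
% Its self-adjoint extensions form a one-parameter U(1) family fixed by
% the boundary condition
\hat p_\theta = -i\,\frac{d}{dx},\qquad
\operatorname{dom}\hat p_\theta
 = \bigl\{\psi \in H^1(0,L) \,:\, \psi(L) = e^{i\theta}\psi(0)\bigr\},
\qquad \theta\in[0,2\pi).
% On the half line L^2(0,\infty) the deficiency indices are instead
% unequal, and no self-adjoint extension of -i\,d/dx exists.
```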

Relevance: 60.00%

Abstract:

Although the oral cavity is easily accessible to inspection, patients with oral cancer most often present at a late stage, leading to high morbidity and mortality. Autofluorescence imaging has emerged as a promising technology to aid clinicians in screening for oral neoplasia and as an aid to resection, but current approaches rely on subjective interpretation. We present a new method to objectively delineate neoplastic oral mucosa using autofluorescence imaging. Autofluorescence images were obtained from 56 patients with oral lesions and 11 normal volunteers. From these images, 276 measurements from 159 unique region-of-interest (ROI) sites corresponding to normal and confirmed neoplastic areas were identified. Data from the ROIs in the first 46 subjects were used to develop a simple classification algorithm based on the ratio of red-to-green fluorescence; the performance of this algorithm was then validated using data from the ROIs in the last 21 subjects. The algorithm was applied to patient images to create visual disease-probability maps across the field of view, and histologic sections of resected tissue were used to validate these maps. The best discrimination between neoplastic and nonneoplastic areas was obtained at 405 nm excitation; normal tissue could be discriminated from dysplasia and invasive cancer with 95.9% sensitivity and 96.2% specificity in the training set, and with 100% sensitivity and 91.4% specificity in the validation set. Disease-probability maps qualitatively agreed with both clinical impression and histology. Autofluorescence imaging coupled with objective image analysis provides a sensitive and noninvasive tool for the detection of oral neoplasia.
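A minimal sketch of such a ratio-based classifier (the threshold of 1.0 and the pixel values are invented placeholders; the study derives its own cut-off from the training ROIs):

```python
import numpy as np

# Hedged sketch: loss of green autofluorescence in neoplastic mucosa
# raises the red-to-green ratio, so an ROI can be labelled by comparing
# its mean R/G ratio with a threshold learned from training data.

def classify_roi(rgb_pixels, threshold=1.0):
    """Label an ROI from the mean red-to-green ratio of its pixels."""
    px = np.asarray(rgb_pixels, dtype=float)
    ratio = np.mean(px[:, 0] / np.maximum(px[:, 1], 1e-9))
    return "neoplastic" if ratio > threshold else "normal"

normal_roi = [[40, 120, 10], [35, 110, 12]]   # green-dominated pixels
lesion_roi = [[90, 45, 10], [80, 40, 15]]     # red-shifted pixels
print(classify_roi(normal_roi), classify_roi(lesion_roi))  # normal neoplastic
```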

Relevance: 60.00%

Abstract:

A new method for the characterization and analysis of aggregate particles in asphaltic mixtures is reported. By relying on a multiscale representation of the particles, curvature estimation, and discriminant analysis for optimal separation of the categories of mixtures, a particularly effective and comprehensive methodology is obtained. The potential of the methodology is illustrated for three important types of particles used in asphaltic mixtures, namely basalt, gabbro and gravel. The results show that gravel particles are markedly distinct from the other two types, with the gabbro category exhibiting intermediate geometrical properties. The importance of each considered measurement in discriminating between the three categories of particles was also quantified using the adopted discriminant analysis.
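Curvature estimation on a digitized contour can be sketched as follows (an illustrative finite-difference version, not the authors' multiscale pipeline); the resulting statistics are the kind of shape features a discriminant analysis would separate:

```python
import numpy as np

# Illustrative sketch: estimate the curvature along a closed particle
# contour with finite differences and summarize it with simple statistics.
# The parametric curvature formula
#   k = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
# is parameterization-invariant, so unit index spacing is fine.

def curvature_features(x, y):
    """Return (mean, std) of |curvature| along a sampled contour."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    k = (dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5
    return np.mean(np.abs(k)), np.std(np.abs(k))

# Sanity check: a circle of radius 10 has constant curvature 0.1.
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
mean_k, std_k = curvature_features(10 * np.cos(t), 10 * np.sin(t))
print(abs(mean_k - 0.1) < 0.01)  # True
```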

Relevance: 60.00%

Abstract:

A Nonlinear Programming algorithm that converges to second-order stationary points is introduced in this paper. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples are exhibited showing that the new method converges to second-order stationary points in situations in which first-order methods fail.
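The PHR augmented-Lagrangian idea itself can be sketched on a toy problem (a hedged, first-order illustration only; the paper's algorithm additionally exploits negative curvature and handles box constraints):

```python
import numpy as np

# Toy PHR augmented-Lagrangian sketch: minimize f(x) = x1^2 + x2^2
# subject to h(x) = x1 + x2 - 1 = 0, whose solution is (0.5, 0.5).

def h(x):
    return x[0] + x[1] - 1.0

def augmented_lagrangian_solve(rho=10.0, outer=20, inner=500, step=1e-2):
    x, lam = np.zeros(2), 0.0
    for _ in range(outer):
        # Inner loop: gradient descent on the augmented Lagrangian
        # L(x) = f(x) + lam*h(x) + (rho/2)*h(x)^2.
        for _ in range(inner):
            g = 2.0 * x + (lam + rho * h(x)) * np.ones(2)
            x = x - step * g
        # Outer loop: PHR multiplier update.
        lam += rho * h(x)
    return x

x = augmented_lagrangian_solve()
print(np.round(x, 3))  # ~[0.5 0.5]
```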

Relevance: 60.00%

Abstract:

The purpose of this article is to present a new method for predicting the response variable of an observation in a new cluster for a multilevel logistic regression. The central idea is based on the empirical best estimator of the random effect. Two estimation methods for the multilevel model are compared: penalized quasi-likelihood and Gauss-Hermite quadrature. The performance of the multilevel logistic model in predicting the probability for an observation in a new cluster, in comparison with the usual logistic model, is examined through simulations and an application.
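Gauss-Hermite quadrature for a random-intercept logistic model can be sketched as follows (illustrative names and values; u ~ N(0, σ²) plays the role of the cluster random effect):

```python
import numpy as np

# Hedged sketch: the marginal likelihood of a cluster in a random-
# intercept logistic model integrates the Bernoulli likelihood over
# u ~ N(0, sigma^2).  Gauss-Hermite quadrature approximates that
# integral with a weighted sum over nodes.

def marginal_likelihood(y, eta, sigma, n_nodes=30):
    """Integral of prod_i Bernoulli(y_i | logistic(eta_i + u)) over
    u ~ N(0, sigma^2), via Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    total = 0.0
    for xk, wk in zip(nodes, weights):
        u = np.sqrt(2.0) * sigma * xk          # change of variables
        p = 1.0 / (1.0 + np.exp(-(eta + u)))
        total += wk * np.prod(np.where(y == 1, p, 1.0 - p))
    return total / np.sqrt(np.pi)

# Sanity check: with sigma = 0 the integral collapses to the ordinary
# logistic likelihood.
y = np.array([1, 0, 1])
eta = np.array([0.2, -0.5, 1.0])
p = 1.0 / (1.0 + np.exp(-eta))
exact = np.prod(np.where(y == 1, p, 1.0 - p))
print(np.isclose(marginal_likelihood(y, eta, 0.0), exact))  # True
```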

Relevance: 60.00%

Abstract:

A new method is presented for the spectrophotometric determination of total polyphenol content in wine. The procedure is a modified CUPRAC method based on the reduction of Cu(II) by polyphenols in hydroethanolic medium (pH 7.0) in the presence of neocuproine (2,9-dimethyl-1,10-phenanthroline), yielding Cu(I) complexes with a maximum absorption peak at 450 nm. The absorbance values are linear (r = 0.998, n = 6) with tannic acid concentrations from 0.4 to 3.6 μmol L⁻¹. The limit of detection was 0.41 μmol L⁻¹ and the relative standard deviation 1.2% (1 μmol L⁻¹; n = 8). Recoveries between 80% and 110% (mean value 95%) were obtained for the determination of total polyphenols in 14 commercial and 2 synthetic wine samples (with and without sulphite). The proposed procedure is about 1.5 times more sensitive than the official Folin-Ciocalteu method; the sensitivities of both methods were compared through the analytical responses of several polyphenols tested in each method. (C) 2010 Elsevier Ltd. All rights reserved.
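The reported linear range implies a routine calibration step, sketched here with invented readings (not the paper's data):

```python
import numpy as np

# Illustrative calibration sketch for a spectrophotometric method like
# the one above: fit absorbance at 450 nm against standard
# concentrations, then invert the fitted line to quantify a sample.

conc = np.array([0.4, 1.0, 1.6, 2.2, 2.8, 3.6])   # umol/L standards
absorb = 0.05 + 0.20 * conc                        # synthetic A450 readings

slope, intercept = np.polyfit(conc, absorb, 1)     # linear calibration

def quantify(a450):
    """Concentration (umol/L) from a measured absorbance at 450 nm."""
    return (a450 - intercept) / slope

print(round(quantify(0.45), 3))  # 2.0
```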

Relevance: 60.00%

Abstract:

A new method for the preparation of alpha,beta-unsaturated diazoketones from aldehydes and a Horner-Wadsworth-Emmons reagent is reported. The method was applied to the short synthesis of two substituted pyrrolidines.

Relevance: 60.00%

Abstract:

Summary (translated from Swedish): Dalarna University, in cooperation with Skogsägarna Mellanskog, Naturbränsle i Mellansverige AB and GDE-Net, has carried out studies of a new method for extracting forest fuel from final fellings. In this method, saw timber is taken out as the only roundwood assortment; the rest of the tree, together with smaller trees that do not reach timber dimensions, is taken out as a fuel assortment. The method was compared with a conventional final felling with extraction of timber, pulpwood and chips from logging residues. According to the trials, harvesting with the new method ("long tops") would give a higher net logging revenue and slightly more than twice as much fuel chips as a conventional harvest. One reason for the higher net revenue is that chipping costs less than chipping logging residues and that the chips are paid better than residue chips. The results depend on the actual stand conditions and on the prevailing price relations between pulpwood and fuel chips. Factors with a positive effect on the net revenue when extracting "long tops" include large harvest volumes, short off-road transport distances, and stands with a high share of low-quality wood or odd assortments that are poorly paid on the roundwood market.

Summary: In Sweden, forest energy from final felling is traditionally harvested as logging residues after the harvesting of timber (saw logs) and pulpwood, but depending on the market situation, other methods with a higher yield of forest energy may be of interest. Dalarna University has studied a new method called "undelimbed long tops", in which only saw timber is taken out as an industrial assortment; the rest of the trees, and smaller trees that do not reach timber dimensions, were left intact on the clear-felled area and chipped later. The study was carried out in stands with differing conditions, and the results were compared with the traditional method for final felling. The surplus (the forest owner's net income) was higher in almost all stands when the method with undelimbed long tops was used, compared with the traditional method for extracting forest energy, and the volume of chips was more than doubled. One reason for the higher income from long tops is that the cost of chipping is lower and the price of the chips is higher compared with chips from logging residues. Another reason is that forest owners are not paid for wasted pulpwood but are fully paid for the chips made from such pulpwood. Factors that have a positive influence on the ULT method include, for example, large logging volumes, short distances between the logging area and the landing, various price reductions on pulpwood, and large volumes of rotten wood or poorly paid industrial assortments.

Relevance: 60.00%

Abstract:

The cost of a road construction over its service life is a function of its design, construction quality, maintenance strategies and maintenance operations. Unfortunately, designers often neglect a very important aspect: the possibility of performing future maintenance activities. The focus is mainly on other aspects, such as investment costs, traffic safety, aesthetic appearance, regional development and environmental effects. This licentiate thesis is part of a Ph.D. project entitled "Road Design for lower maintenance costs", which aims to examine how life-cycle costs can be optimized by selecting appropriate geometrical designs for roads and their components. The result is expected to provide a basis for a new method for the road planning and design process that uses life-cycle cost analysis with particular emphasis on road maintenance. The project started with a literature review intended to study the conditions causing increased need for road maintenance, the efforts made by road authorities to satisfy those needs, and the improvement potential of considering maintenance aspects during planning and design. An investigation was carried out to identify the problems that obstruct due consideration of maintenance aspects during the road planning and design process. This investigation focused mainly on the road planning and design process at the Swedish Road Administration; however, the processes in Denmark, Finland and Norway were also broadly evaluated to gain wider knowledge of the research subject. The investigation was carried out in two phases, data collection and data analysis. Data were collected through semi-structured interviews with experts involved in planning, design and maintenance and through a review of design-related documents, and were analysed using a method called "Change Analysis".

This investigation revealed a complex combination of problems that result in inadequate consideration of maintenance aspects, and several urgent needs for change to eliminate these problems were identified. Another study was carried out to develop a model for calculating the repair costs of damage to different road barrier types and to analyse how factors such as road type, speed limit, barrier type, barrier placement, type of road section, alignment and season affect barrier damage and the associated repair costs. This study used the case study research method. Data were collected from 1087 barrier repairs in two regional offices of the Swedish Road Administration, the Central Region and the Western Region. For both regions, a table was established containing the repair cost per vehicle kilometre for different combinations of barrier type, road type and speed limit. Designers can use this table when calculating the life-cycle costs of different road barrier types.
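Such a table can feed directly into a life-cycle cost calculation. A hedged sketch (all figures and category names below are invented placeholders, not the thesis's values):

```python
# Hypothetical repair-cost table, keyed by (barrier type, road type,
# speed limit in km/h), giving SEK per vehicle-kilometre.  The values
# are placeholders, not the thesis's data.
REPAIR_COST_PER_VKM = {
    ("w-beam", "2-lane", 80): 0.010,
    ("cable", "2-lane", 80): 0.015,
}

def barrier_repair_cost(barrier, road, speed, aadt, length_km, years):
    """Expected total repair cost over the analysis horizon:
    rate * annual vehicle-kilometres * number of years."""
    rate = REPAIR_COST_PER_VKM[(barrier, road, speed)]
    vehicle_km_per_year = aadt * 365 * length_km
    return rate * vehicle_km_per_year * years

total = barrier_repair_cost("w-beam", "2-lane", 80,
                            aadt=4000, length_km=2.0, years=20)
print(total)  # 584000.0 (SEK, with the placeholder figures above)
```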

Relevance: 60.00%

Abstract:

Combinatorial optimization problems are among the most important problem types in operational research. Heuristic and metaheuristic algorithms are widely applied to find good solutions. However, a common problem is that these algorithms do not guarantee that the solution coincides with the optimum; hence, many solutions to real-world OR problems carry an uncertainty about their quality. The main aim of this thesis is to investigate the usability of statistical bounds for evaluating the quality of heuristic solutions to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds perform well in providing informative quality assessments under appropriate parameter settings, and they outperform the commonly used Lagrangian bounds. The statistical bounds are also shown to be comparable with the deterministic bounds in quadratic assignment problems. As to the empirical research, environmental pollution has become a worldwide problem, and transportation can cause a great amount of pollution. A new method for calculating and comparing the CO2 emissions of online and brick-and-mortar retailing is proposed; it leads to the conclusion that online retailing has significantly lower CO2 emissions. Another problem is that the Swedish regional division is under revision, and the effect of borders on accessibility to public services concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services, and consequently the highest achievable economic and social utility may not be attained.
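The idea of a statistical bound can be sketched as follows (a hedged illustration; the thesis may use more elaborate estimators, e.g. Weibull-based ones, while here the simple Robson-Whitlock endpoint estimator is shown):

```python
import numpy as np

# Hedged sketch of the statistical-bound idea: treat objective values
# from repeated randomized heuristic runs as a sample and estimate the
# unknown optimum as the lower endpoint of their distribution.  The
# Robson-Whitlock estimator uses the two smallest values: 2*x(1) - x(2).

def statistical_lower_bound(values):
    """Point estimate of the lower endpoint from the two smallest values."""
    x = np.sort(np.asarray(values, dtype=float))
    return 2.0 * x[0] - x[1]

rng = np.random.default_rng(0)
# Synthetic 'heuristic' objective values whose true optimum is 100.
runs = 100.0 + 5.0 * rng.weibull(2.0, size=50)
bound = statistical_lower_bound(runs)
print(bound <= runs.min())  # True: the estimate never exceeds the best run
```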

Relevance: 60.00%

Abstract:

Dynamic system test methods for heating systems had been developed and applied by the institutes SERC and SP in Sweden, INES in France and SPF in Switzerland before the MacSheep project started. These test methods followed the same principle: a complete heating system, including heat generators, storage, control etc., is installed on the test rig; the test rig software and hardware simulates and emulates the heat load for space heating and domestic hot water of a single-family house, while the unit under test has to act autonomously to cover the heat demand during a representative test cycle. Within work package 2 of the MacSheep project these similar, but different, test methods were harmonized and improved. The work undertaken includes:

• Harmonization of the physical boundaries of the unit under test.
• Harmonization of the boundary conditions of climate and load.
• Definition of an approach to reach an identical space heat load in combination with autonomous control of the space heat distribution by the unit under test.
• Derivation and validation of new six-day and twelve-day test profiles for direct extrapolation of test results.

The new harmonized test method combines the advantages of the different methods that existed before the MacSheep project. The new method is a benchmark test: the load for space heating and domestic hot water preparation is identical for all tested systems, and the result is representative of the performance of the system over a whole year. Thus, no modelling and simulation of the tested system is needed to obtain benchmark results for a yearly cycle, and the method is therefore also applicable to products for which simulation models are not yet available. Some of the advantages of the new whole-system test method and performance rating compared with the testing and energy rating of single components are:

• Interactions between the different components of a heating system, e.g. storage, solar collector circuit, heat pump and control, are included and evaluated in the test.
• Dynamic effects are included and influence the result just as they influence annual performance in the field.
• Heat losses influence the results in a more realistic way, since they are evaluated under "real installed", representative part-load conditions rather than under single-component steady-state conditions.

The described method is also suited to the development process of new systems, where it replaces time-consuming and costly field testing, with the advantages of higher accuracy of the measured data (compared with the measurement equipment typically used in field tests) and identical, thus comparable, boundary conditions. The method can therefore be used for system optimization in the test bench under realistic operating conditions, i.e. in a relevant operating environment in the lab. This report describes the physical boundaries of the tested systems as well as the test procedures and the requirements for both the unit under test and the test facility. The new six-day and twelve-day test profiles are also described, as are the validation results.

Relevance: 60.00%

Abstract:

The aim was to evaluate results and experiences from the development of new technology, a training programme and implementation strategies for the use of a video exposure monitoring method, PIMEX. The starting point of this study is the increased incidence of asthma among workers in the aluminium industry, in which exposure peaks of fumes are thought to play an important role. PIMEX makes it possible to link work practices, the use of control technology, and so forth to such peaks. Nine companies participated in the project, which was divided into three parts: development of PIMEX technology, production of training material, and training in the use of the equipment and related strategies. The use of the video exposure monitoring method PIMEX supports workers' participation in safety activities. The experiences from the project reveal the importance of good timing of initial training, technology development, technical support and follow-up training. Despite a delay in delivery of the new technology, representatives of the participating companies stated that PIMEX made an important contribution to effective control of hazards in their companies. Eight of the nine smelters used the PIMEX method as part of a strategy for controlling workers' exposure to fumes in potrooms, and possibilities to conduct effective control measures were identified. This article describes experiences from the implementation of a method, new to this branch, that supports workers' participation in workplace improvements.

Relevance: 60.00%

Abstract:

In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA performs as well as other methods on the standard set of medium-size problems taken from Beasley's benchmark, and produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
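The p-median objective, and one classical way to improve a solution, can be sketched as follows (this is the well-known vertex-substitution idea of Teitz and Bart, shown for illustration; it is not the impGA proposed in the paper):

```python
import numpy as np

# Hedged illustration: the p-median objective assigns each demand node
# to its nearest chosen median and sums the distances; the heuristic
# keeps applying profitable single-median swaps until none remains.

def pmedian_cost(dist, medians):
    return dist[:, medians].min(axis=1).sum()

def vertex_substitution(dist, p, seed=0):
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    medians = list(rng.choice(n, size=p, replace=False))
    improved = True
    while improved:
        improved = False
        for i in range(p):
            for cand in range(n):
                if cand in medians:
                    continue
                trial = medians.copy()
                trial[i] = cand
                if pmedian_cost(dist, trial) < pmedian_cost(dist, medians):
                    medians, improved = trial, True
    return sorted(medians), pmedian_cost(dist, medians)

# Tiny check: for points 0, 1, 10, 11 on a line and p = 2, the optimum
# places one median in each cluster, for a total cost of 2.
pts = np.array([0.0, 1.0, 10.0, 11.0])
dist = np.abs(pts[:, None] - pts[None, :])
meds, cost = vertex_substitution(dist, p=2)
print(cost)  # 2.0
```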

Relevance: 60.00%

Abstract:

Based on a case study, this dissertation focuses on the "Gestão da Produtividade Aplicada aos Correios" (GPAC) program at a public company, the Empresa Brasileira de Correios e Telégrafos (ECT). The objective of this work is to critically analyze a method for introducing concepts, principles and techniques of Management and Production Engineering from the perspective of a learning organization, taking ECT as the object of analysis. It starts from a historical perspective covering the evolution of the GPAC program in the organization, from its initial phase in 1995 to its current state of development, describing the main activities undertaken in chronological order. After a theoretical grounding that reviews several authors' approaches to organizational learning, the GPAC program is analyzed in light of the learning-organization premises advocated by Garvin, and a new method is proposed. This new method presents some changes and additions relative to the method adopted in the GPAC program, aiming to improve the effectiveness of disseminating and implementing new Management and Production Engineering concepts, principles and techniques in the company.