943 results for Pechini method and chromium


Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2015

Relevance: 100.00%

Abstract:

Contemporary cities are frequently surrounded by transitional landscapes: ambiguous lands, non-places on the urban edge, commonly experienced under the condition of speed. Although variously shaped by processes of urbanisation, logistics of road engineering, safety and ownership, and local people's lives, for travellers such landscapes are usually perceived in a state of disappearance. This condition presents a major challenge for the traditional methods used in architecture and urban design. For designers interested in the organisation and design of such mobility routes for the engagement of the traveller, a method of scripting based on notation timelines would provide a helpful supplement to traditional master plans. This paper explores the development of such a method and its roots in time-based arts, such as dance, music and film, as well as in the recent history of architecture and urban design. It does so through the presentation of an experimental study based on a real route, the train journey from London to Stansted airport.

Relevance: 100.00%

Abstract:

This short commentary outlines psychoanalysis as a theory and method and its potential value to media research. Following Dahlgren (2013), it is suggested that psychoanalysis may enrich the field by offering a complex theory of the human subject, as well as methodological means of doing justice to the richness, ambivalence and contradictions of human experience. The psychoanalytic technique of free association, and how it has been adapted in social research (Hollway and Jefferson 2000), is suggested as a means to open up subjective modes of expression and thinking – in researchers and research participants alike – that lie beyond rationality and conscious agency.

Relevance: 100.00%

Abstract:

To be legitimate, research needs to be ethical, methodologically sound, of sufficient value to justify public expenditure, and transparent. Animal research has always been contested on ethical grounds, but there is now mounting evidence of poor scientific method and growing doubts about its clinical value. So what of transparency? Here we examine the increasing focus on openness within animal research in the UK, analysing recent developments within the Home Office and within the main group representing the interests of the sector, Understanding Animal Research. We argue that, while important steps are being taken toward greater transparency, the legitimacy of animal research continues to be undermined by selective openness. We propose that openness could be increased through public involvement, and that this would bring about much-needed improvements in animal research, as it has done in clinical research.

Relevance: 100.00%

Abstract:

In real optimization problems, the analytical expression of the objective function and its derivatives are usually unknown, or too complex to work with. In these cases it becomes essential to use optimization methods that do not require computing derivatives, or even verifying that they exist: direct search (derivative-free) methods are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for doing so are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of simplex methods and filter methods. The method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
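To make the acceptance rule concrete, here is a minimal Python sketch of the filter logic that such a simplex filter method relies on. The names and the l1 aggregation are illustrative choices, not the authors' Java implementation.

```python
import numpy as np

def constraint_violation(g_values):
    """Aggregate the violations of constraints g_i(x) <= 0 into one scalar h(x) >= 0."""
    return float(np.sum(np.maximum(0.0, g_values)))

class Filter:
    """A filter stores (h, f) pairs; a trial point is acceptable only if no
    stored pair dominates it, i.e. is at least as good in both violation h
    and objective f. Practical filters add small margins ("envelopes")."""

    def __init__(self):
        self.entries = []  # list of (h, f) pairs

    def acceptable(self, h, f):
        return all(not (h_k <= h and f_k <= f) for h_k, f_k in self.entries)

    def add(self, h, f):
        # Drop entries dominated by the new pair, then record it.
        self.entries = [(h_k, f_k) for h_k, f_k in self.entries
                        if not (h <= h_k and f <= f_k)]
        self.entries.append((h, f))
```

In a simplex step, the reflected (or expanded/contracted) vertex is evaluated and replaces the worst vertex only if `acceptable(h, f)` holds, after which the pair is added to the filter.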

Relevance: 100.00%

Abstract:

Aiming at simple and accurate readings of citric acid (CA) in complex samples, citrate (CIT)-selective electrodes with a tubular configuration and polymeric membranes incorporating a quaternary ammonium ion exchanger were constructed. Several selective membranes were prepared for this purpose, with distinct mediator solvents (of quite different polarities) and, in some cases, p-tert-octylphenol (TOP) as additive, the latter used with a view to increasing selectivity. The general working characteristics of all prepared electrodes were evaluated in a low-dispersion flow injection analysis (FIA) manifold by injecting 500 µl of citrate standard solutions into an ionic strength (IS) adjuster carrier (10⁻² mol l⁻¹) flowing at 3 ml min⁻¹. Good potentiometric response, with an average slope of 61.9 mV per decade and a repeatability of ±0.8%, was obtained with selective membranes comprising the additive and bis(2-ethylhexyl)sebacate (bEHS) as mediator solvent. The same membranes also showed the best selectivity characteristics, assessed by the separate solutions method for several chemical species, such as chloride, nitrate, ascorbate, glucose, fructose and sucrose. Pharmaceutical preparations, soft drinks and beers were analyzed under conditions that enabled simultaneous pH and ionic strength adjustment (pH = 3.2; ionic strength = 10⁻² mol l⁻¹), and the results agreed well with the reference method (relative error < 4%). These experimental conditions promoted a significant increase in the sensitivity of the potentiometric response, with a supra-Nernstian slope of 80.2 mV per decade, and allowed the analysis of about 90 samples per hour with a relative standard deviation < 1.0%.
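For reference, the reported average slope can be compared with the theoretical Nernstian slope. Near the working pH of 3.2, citrate is predominantly a monovalent anion, so |z| = 1 is the natural comparison; the standard relation (not taken from the paper) gives:

```latex
% Ideal Nernstian slope magnitude at T = 298 K for an ion of charge number z:
\[
  S = \frac{2.303\,R\,T}{|z|\,F}
    = \frac{2.303 \times 8.314 \times 298}{1 \times 96485}\ \mathrm{V\ per\ decade}
    \approx 59.2\ \mathrm{mV\ per\ decade}
\]
```

The measured 61.9 mV per decade is thus close to (slightly above) the ideal monovalent value; for an anion the slope is negative in sign, and magnitudes are quoted here.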

Relevance: 100.00%

Abstract:

Radio frequency (RF) energy harvesting is an emerging technology that will enable the next generation of wireless sensor networks (WSNs) to operate without batteries. In this paper, we present RF energy harvesting circuits specifically developed for the GSM bands (900/1800 MHz) and a wearable dual-band antenna suitable for integration into clothing for body-worn applications. In addition, we address the development and experimental characterization of three prototypes of a five-stage Dickson voltage multiplier (with an impedance matching circuit) responsible for harvesting the RF energy. Different printed circuit board (PCB) fabrication techniques used to produce the prototypes result in different conversion efficiencies; we conclude that rigorous control of the photo-positive method and of the chemical bath procedure during PCB fabrication yields better conversion efficiency. All three prototypes (1, 2 and 3) can power the IRIS sensor node at received RF powers of -4 dBm, -6 dBm and -5 dBm, with conversion efficiencies of 20%, 32% and 26%, respectively. © 2014 IEEE.
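As a rough sanity check on these power thresholds, the sketch below applies a common first-order approximation for the unloaded output of an N-stage Dickson multiplier. The Schottky diode drop and the matching network's voltage boost are assumptions for illustration, not values from the paper.

```python
import math

def dbm_to_volts_peak(p_dbm, r_ohms=50.0):
    """Peak voltage of a sine wave delivering p_dbm into a matched resistive load."""
    p_watts = 10 ** (p_dbm / 10) / 1000
    return math.sqrt(2 * p_watts * r_ohms)

def dickson_vout(n_stages, v_peak, v_diode=0.2):
    """First-order unloaded output: each stage adds about 2*(Vpeak - Vdiode).
    v_diode ~ 0.2 V is a typical Schottky drop (assumed, not measured)."""
    return 2 * n_stages * max(0.0, v_peak - v_diode)

v_in = dbm_to_volts_peak(-4)       # ~0.20 V peak into 50 ohms at -4 dBm
print(dickson_vout(5, v_in))       # ~0 V: the raw input barely exceeds the diode drop
print(dickson_vout(5, 5 * v_in))   # with a hypothetical 5x passive boost from matching
```

The numbers illustrate why the matching circuit matters: at -4 dBm the unmatched peak voltage is comparable to a Schottky drop, and only the passive voltage step-up of the matching network lets the multiplier develop a useful output.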

Relevance: 100.00%

Abstract:

Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative for solving such problems is the filter method. Filter methods, introduced by Fletcher and Leyffer in 2002, have been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective one, attempting to minimize both the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filter method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work, we have developed a new direct search method, based on simplex methods, for general constrained optimization that combines the features of the simplex method and filter methods. This work presents a new variant of these methods, which combines the filter method with other direct search methods, and proposes some alternatives for aggregating the constraint violation functions.
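The aggregation step admits several natural choices. The Python sketch below shows three common ones; these particular norms are illustrative and not necessarily the alternatives proposed in this work.

```python
import numpy as np

# Three ways to aggregate the violations of constraints g_i(x) <= 0
# into a single measure h(x) for the filter.
def h_l1(g):  return float(np.sum(np.maximum(0.0, g)))           # sum of violations
def h_l2(g):  return float(np.linalg.norm(np.maximum(0.0, g)))   # Euclidean norm
def h_max(g): return float(np.max(np.maximum(0.0, g)))           # worst violation

g = np.array([-0.5, 0.3, 1.2])      # the last two constraints are violated
print(h_l1(g), h_l2(g), h_max(g))   # 1.5  ~1.237  1.2
```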

Relevance: 100.00%

Abstract:

Phenylketonuria is an inborn error of metabolism involving, in most cases, deficient activity of phenylalanine hydroxylase. Neonatal diagnosis and a prompt special diet (low in phenylalanine and restricted in natural protein) are essential to treatment. The lack of data on the phenylalanine content of processed foodstuffs is an additional limitation for an already very restrictive diet. Our goals were to quantify the protein (Kjeldahl method) and amino acid (18 amino acids, by HPLC/fluorescence) content of 16 dishes specifically conceived for phenylketonuric patients, and to compare the most relevant results with several international food composition databases. As might be expected, all the meals contained low protein levels (0.67–3.15 g/100 g), with the highest occurring in boiled rice and potatoes. These foods also contained the highest amounts of phenylalanine (158.51 and 62.65 mg/100 g, respectively). In contrast to the other amino acids, it was possible to predict phenylalanine content from protein content alone. Slight deviations were observed when comparing our results with the different food composition databases.
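These figures are consistent with the common rule of thumb that phenylalanine accounts for roughly 5% of food protein (about 50 mg per g of protein). Assuming the 3.15 g/100 g protein value corresponds to the boiled rice dish (a pairing the abstract implies but does not state):

```latex
% Phe-to-protein ratio implied by the reported rice values:
\[
  \frac{158.51\ \mathrm{mg\ Phe}/100\,\mathrm{g}}
       {3.15\ \mathrm{g\ protein}/100\,\mathrm{g}}
  \approx 50.3\ \mathrm{mg\ Phe\ per\ g\ of\ protein}
\]
```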

Relevance: 100.00%

Abstract:

This study investigates groundwater quality for irrigation, the vulnerability of the aquifer system to pollution, and the aquifer potential for sustainable water resources development in the Kobo Valley development project. Groundwater quality is evaluated by predicting the spatial distribution of hydrogeochemical parameters with a geostatistical method and comparing them with the water quality guidelines for irrigation. The hydrogeochemical parameters considered are SAR, EC, TDS, Cl⁻, Na⁺, Ca²⁺, SO₄²⁻ and HCO₃⁻. The spatial variability maps show that these parameters fall into safe, moderate, and severe or increasing-problem classes. To present this clearly, an aggregated Water Quality Index (WQI) map is constructed using the weighted arithmetic mean method. The Kobo-Gerbi sub-basin suffers from poor water quality for irrigation; the Waja Golesha sub-basin is moderate, and Hormat Golena is the best sub-basin in terms of water quality. The groundwater vulnerability of the study area is assessed using the GOD rating system: the whole area experiences moderate to high vulnerability, a clear warning that the resource needs proper management, with the highest risks in the Hormat Golena and Waja Golesha sub-basins. The aquifer potential of the study area is obtained using weighted overlay analysis: 73.3% of the total area is a good site for future water well development, while the remaining 26.7% is not considered suitable for siting groundwater wells; most of the latter falls within the Kobo-Gerbi sub-basin.
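The weighted arithmetic mean WQI named above has a standard form, sketched below in Python; the weights and quality ratings are made-up placeholders, not the study's values.

```python
# Weighted arithmetic mean water quality index:
#   WQI = sum(w_i * q_i) / sum(w_i)
# where q_i is a 0-100 quality rating for parameter i and w_i its weight.
def wqi(ratings, weights):
    assert len(ratings) == len(weights)
    return sum(w * q for w, q in zip(weights, ratings)) / sum(weights)

# Hypothetical example with three parameters (say EC, SAR and Cl-):
print(wqi(ratings=[40.0, 75.0, 90.0], weights=[0.5, 0.3, 0.2]))  # 60.5
```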

Relevance: 100.00%

Abstract:

This thesis values a start-up company (Jogos Almirante Lda) whose single asset is a board game named Almirante. It aims to conclude whether it makes sense to create a company or simply to earn copyright royalties. The thesis analyzes the board game market, as part of the general toy market, for which some data exist: European countries as well as the USA. It then examines the several ways of financing a start-up company and presents an overview of the valuation of Jogos Almirante based on three different methods: Discounted Cash Flow, the Venture Capital Method and Real Options.
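Of the three methods, Discounted Cash Flow is the most mechanical; the sketch below shows its basic form in Python, with hypothetical cash flows, discount rate and terminal growth rather than the thesis's figures.

```python
# Minimal DCF: present value of explicit-period cash flows plus a
# Gordon-growth terminal value (requires rate > terminal_growth).
def dcf_value(cash_flows, rate, terminal_growth=0.0):
    pv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (rate - terminal_growth)
    return pv + terminal / (1 + rate) ** len(cash_flows)

# Hypothetical three-year forecast with a 20% discount rate:
print(round(dcf_value([10_000, 15_000, 20_000], rate=0.20, terminal_growth=0.02)))
```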

Relevance: 100.00%

Abstract:

BACKGROUND AND PURPOSE: Accurate placement of an external ventricular drain (EVD) for the treatment of hydrocephalus is of paramount importance for its functionality and for minimizing morbidity and complications. The aim of this study was to compare two drain insertion assistance tools with the traditional free-hand anatomical landmark method, and to measure efficacy, safety and precision. METHODS: Ten cadaver heads were prepared by opening large bone windows centered on Kocher's points on both sides. Nineteen physicians, divided into two groups (trainees and board-certified neurosurgeons), performed EVD insertions. The target for the ventricular drain tip was the ipsilateral foramen of Monro. Each participant inserted the external ventricular catheter in three different ways: 1) free-hand by anatomical landmarks, 2) neuronavigation-assisted (NN), and 3) XperCT-guided (XCT). The number of ventricular hits and dangerous trajectories, time to proceed, radiation exposure of patients and physicians, distance of the catheter tip to target, and size of deviations projected in the orthogonal planes were measured and compared. RESULTS: Insertion using XCT increased the probability of ventricular puncture from 69.2% to 90.2% (p = 0.02). Non-assisted placements were significantly less precise (catheter tip to target distance 14.3 ± 7.4 mm versus 9.6 ± 7.2 mm, p = 0.0003). The insertion time increased from 3.04 ± 2.06 min to 7.3 ± 3.6 min (p < 0.001). The X-ray exposure for XCT was 32.23 mSv, but could be reduced to 13.9 mSv if patients were initially imaged in the hybrid operating suite. No supplementary radiation exposure is needed for NN if patients are initially imaged according to a navigation protocol. CONCLUSION: This ex vivo study demonstrates significantly improved accuracy and safety using either the NN- or XCT-assisted method. Efforts should therefore be undertaken to implement these new technologies into daily clinical practice. However, the accuracy of an image-guided insertion must be balanced against its urgency, as it entails a longer preparation time for the specific image acquisition and trajectory planning.

Relevance: 100.00%

Abstract:

The present study was performed to assess the interlaboratory reproducibility of the molecular detection and identification of species of Zygomycetes from formalin-fixed paraffin-embedded kidney and brain tissues obtained from experimentally infected mice. Animals were infected with one of five species (Rhizopus oryzae, Rhizopus microsporus, Lichtheimia corymbifera, Rhizomucor pusillus, and Mucor circinelloides). Samples with 1, 10, or 30 slide cuts of the tissues were prepared from each paraffin block, the sample identities were blinded for analysis, and the samples were mailed to each of seven laboratories for the assessment of sensitivity. A protocol describing the extraction method and the PCR amplification procedure was provided. The internal transcribed spacer 1 (ITS1) region was amplified by PCR with the fungal universal primers ITS1 and ITS2 and sequenced. As negative results were obtained for 93% of the tissue specimens infected by M. circinelloides, the data for this species were excluded from the analysis. Positive PCR results were obtained for 93% (52/56), 89% (50/56), and 27% (15/56) of the samples with 30, 10, and 1 slide cuts, respectively. There were minor differences, depending on the organ tissue, fungal species, and laboratory. Correct species identification was possible for 100% (30 cuts), 98% (10 cuts), and 93% (1 cut) of the cases. With the protocol used in the present study, the interlaboratory reproducibility of ITS sequencing for the identification of major Zygomycetes species from formalin-fixed paraffin-embedded tissues can reach 100%, when enough material is available.

Relevance: 100.00%

Abstract:

Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (inclusion still ongoing) divided into 4 groups. The aim is to evaluate adherence and to offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®), in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years), monitored for an average of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. 5 patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews conducted during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily), according to the questionnaire completed at the end of the monitoring period. Conclusion: Persistence and quality of execution in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities to maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. Longer follow-up will allow a better evaluation of our method and its impact. Interpretation of the results of the other groups in this ongoing trial will provide information for a more detailed analysis.

Relevance: 100.00%

Abstract:

In this paper, we present an efficient numerical scheme for the recently introduced geodesic active fields (GAF) framework for geometric image registration. This framework considers the registration task as a weighted minimal surface problem. Hence, the data-term and the regularization-term are combined through multiplication in a single, parametrization invariant and geometric cost functional. The multiplicative coupling provides an intrinsic, spatially varying and data-dependent tuning of the regularization strength, and the parametrization invariance allows working with images of nonflat geometry, generally defined on any smoothly parametrizable manifold. The resulting energy-minimizing flow, however, has poor numerical properties. Here, we provide an efficient numerical scheme that uses a splitting approach; data and regularity terms are optimized over two distinct deformation fields that are constrained to be equal via an augmented Lagrangian approach. Our approach is more flexible than standard Gaussian regularization, since one can interpolate freely between isotropic Gaussian and anisotropic TV-like smoothing. In this paper, we compare the geodesic active fields method with the popular Demons method and three more recent state-of-the-art algorithms: NL-optical flow, MRF image registration, and landmark-enhanced large displacement optical flow. Thus, we can show the advantages of the proposed FastGAF method. It compares favorably against Demons, both in terms of registration speed and quality. Over the range of example applications, it also consistently produces results not far from more dedicated state-of-the-art methods, illustrating the flexibility of the proposed framework.
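The splitting described here has the familiar alternating structure of an augmented Lagrangian (ADMM-like) scheme. The Python sketch below shows that structure on plain arrays; `data_prox` and `reg_prox` are placeholder solvers for the two sub-problems, and the real method operates on deformation fields rather than 1-D vectors.

```python
import numpy as np

def split_registration(u0, data_prox, reg_prox, mu=1.0, iters=50):
    """Alternate between the data term (over u) and the regularity term
    (over v), enforcing u = v through an augmented Lagrangian multiplier."""
    u, v = u0.copy(), u0.copy()
    lam = np.zeros_like(u0)               # multiplier for the constraint u - v = 0
    for _ in range(iters):
        u = data_prox(v - lam / mu, mu)   # data step, pulled toward v
        v = reg_prox(u + lam / mu, mu)    # regularity step, pulled toward u
        lam += mu * (u - v)               # dual ascent on the constraint
    return u
```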