26 results for drawbacks
at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
equation-of-state studies(2) and fusion energy research(3,4). Laser-driven implosions of spherical polymer shells have, for example, achieved an increase in density of 1,000 times relative to the solid state(5). These densities are large enough to enable controlled fusion, but to achieve energy gain a small volume of compressed fuel (known as the 'spark') must be heated to temperatures of about 10(8) K (corresponding to thermal energies in excess of 10 keV). In the conventional approach to controlled fusion, the spark is both produced and heated by accurately timed shock waves(4), but this process requires both precise implosion symmetry and a very large drive energy. In principle, these requirements can be significantly relaxed by performing the compression and fast heating separately(6-10); however, this 'fast ignitor' approach(7) also suffers from drawbacks, such as propagation losses and deflection of the ultra-intense laser pulse by the plasma surrounding the compressed fuel. Here we employ a new compression geometry that eliminates these problems; we combine production of compressed matter in a laser-driven implosion with picosecond-fast heating by a laser pulse timed to coincide with the peak compression. Our approach therefore permits efficient compression and heating to be carried out simultaneously, providing a route to efficient fusion energy production.
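As a quick check on the figures quoted above (not taken from the paper), the correspondence between a thermal energy of about 10 keV and a temperature of about 10(8) K follows directly from the Boltzmann constant; the short Python snippet below is illustrative arithmetic only.

    # Illustrative arithmetic only: convert the quoted ~10 keV spark thermal
    # energy into a temperature using k_B = 8.617e-5 eV per kelvin.
    BOLTZMANN_EV_PER_K = 8.617e-5          # Boltzmann constant in eV/K
    energy_ev = 10e3                       # thermal energy of the 'spark', ~10 keV
    temperature_k = energy_ev / BOLTZMANN_EV_PER_K
    print(f"{temperature_k:.2e} K")        # ~1.16e+08 K, i.e. about 10^8 K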
Abstract:
We present a technique for simultaneous focusing and energy selection of high-current, mega-electron volt proton beams with the use of radial, transient electric fields (10(7) to 10(10) volts per meter) triggered on the inner walls of a hollow microcylinder by an intense subpicosecond laser pulse. Because of the transient nature of the focusing fields, the proposed method allows selection of a desired range out of the spectrum of the polyenergetic proton beam. This technique addresses current drawbacks of laser-accelerated proton beams, such as their broad spectrum and divergence at the source.
Abstract:
This paper considers invariant texture analysis. Texture analysis approaches whose performance is not affected by translation, rotation, affine and perspective transforms are addressed. Existing invariant texture analysis algorithms are carefully studied and classified into three categories: statistical methods, model-based methods, and structural methods. The importance of invariant texture analysis is presented first. Each approach is reviewed according to its classification, and its merits and drawbacks are outlined. The focus of possible future work is also suggested.
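To make the 'statistical methods' category concrete, the sketch below (not one of the reviewed algorithms) shows that simple first-order grey-level statistics of a toy texture change very little under rotation, which is the basic idea behind many rotation-invariant statistical descriptors.

    # A minimal, hypothetical illustration of rotation-invariant statistics;
    # real invariant texture descriptors are far richer than this.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    texture = ndimage.gaussian_filter(rng.random((128, 128)), sigma=2)   # toy texture
    rotated = ndimage.rotate(texture, 37, reshape=False, mode='wrap')    # rotated copy

    def first_order_stats(img):
        # mean, standard deviation and third central moment of the grey levels
        return np.array([img.mean(), img.std(), ((img - img.mean()) ** 3).mean()])

    print("original:", np.round(first_order_stats(texture), 4))
    print("rotated :", np.round(first_order_stats(rotated), 4))   # similar values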
Abstract:
Chitosan nanoparticles fabricated via different preparation protocols have in recent years been widely studied as carriers for therapeutic proteins and genes, with varying degrees of effectiveness and drawbacks. This work seeks to further explore the polyionic coacervation fabrication process, and the associated processing conditions under which protein encapsulation and subsequent release can be systematically and predictably manipulated so as to obtain the desired effectiveness. BSA was used as a model protein, encapsulated by either the incorporation or the incubation method, using the polyanion tripolyphosphate (TPP) as the coacervation crosslinking agent to form chitosan-BSA-TPP nanoparticles. The BSA-loaded chitosan-TPP nanoparticles were characterized for particle size, morphology, zeta potential, BSA encapsulation efficiency, and subsequent release kinetics, which were found to depend predominantly on chitosan molecular weight, chitosan concentration, BSA loading concentration, and chitosan/TPP mass ratio. The BSA-loaded nanoparticles prepared under varying conditions were in the size range of 200-580 nm and exhibited a high positive zeta potential. Detailed sequential time-frame TEM imaging of the morphological change of the BSA-loaded particles showed a swelling and particle degradation process. The initial burst release, due to surface protein desorption and diffusion from sublayers, did not relate directly to the change of particle size and shape, which became apparent only after 6 h. It is also notable that later-stage particle degradation and disintegration did not yield a substantial follow-on release, as the remaining protein molecules, with adaptable 3-D conformation, could be tightly bound and entangled with the cationic chitosan chains. In general, this study demonstrated that the polyionic coacervation process for fabricating protein-loaded chitosan nanoparticles offers simple preparation conditions and a clear processing window for manipulation of the physicochemical properties of the nanoparticles (e.g., size and surface charge), which can be conditioned to exert control over protein encapsulation efficiency and the subsequent release profile. The weakness of the chitosan nanoparticle system lies typically with difficulties in controlling the initial burst effect, in which large quantities of protein molecules are released. © 2007 Elsevier B.V. All rights reserved.
Abstract:
Background: Results from clinical trials are usually summarized in the form of sampling distributions. When full information (mean, SEM) about these distributions is given, performing meta-analysis is straightforward. However, when some of the sampling distributions only have mean values, a challenging issue is to decide how to use such distributions in meta-analysis. Currently, the most common approaches are either ignoring such trials or, for each trial with a missing SEM, finding a similar trial and taking its SEM value as the missing SEM. Both approaches have drawbacks. As an alternative, this paper develops and tests two new methods, the first being the prognostic method and the second being the interval method, to estimate any missing SEMs from a set of sampling distributions with full information. A merging method is also proposed to handle clinical trials with partial information to simulate meta-analysis.
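For readers unfamiliar with the setting, the sketch below illustrates the common workaround the abstract criticises, borrowing an SEM for a trial that reports only a mean, followed by standard inverse-variance pooling; it does not implement the paper's prognostic, interval or merging methods, and all numbers are invented.

    # Toy illustration only: impute a missing SEM, then pool the trials with
    # inverse-variance (fixed-effect) weights.
    import numpy as np

    # (mean, SEM) per trial; None marks a trial that reports only a mean
    trials = [(1.20, 0.30), (0.95, 0.25), (1.10, None), (1.40, 0.35)]

    known_sems = [s for _, s in trials if s is not None]
    borrowed = float(np.mean(known_sems))   # crude stand-in for "a similar trial's SEM"

    means = np.array([m for m, _ in trials])
    sems = np.array([s if s is not None else borrowed for _, s in trials])

    weights = 1.0 / sems ** 2               # inverse-variance weights
    pooled_mean = np.sum(weights * means) / np.sum(weights)
    pooled_sem = np.sqrt(1.0 / np.sum(weights))
    print(f"pooled effect = {pooled_mean:.3f} +/- {pooled_sem:.3f}")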
Abstract:
Photodynamic therapy (PDT) and photodynamic antimicrobial chemotherapy (PACT) are techniques that combine the effects of visible light irradiation with subsequent biochemical events that arise from the presence of a photosensitizing drug (possessing no dark toxicity) to cause destruction of selected cells. Despite its widespread clinical use, Photofrin® has several drawbacks that limit its general applicability. Consequently, there has been extensive research into the design of improved alternative photosensitizers aimed at overcoming these drawbacks. While there are many review articles on the subject of PDT and PACT, these have focused on the photosensitizers that have been used clinically, with little emphasis placed on how the chemical aspects of the molecule can affect their efficacy as PDT agents. Indeed, many of the PDT/PACT agents used clinically may not even be the most appropriate within a given class. As such, this review aims to provide a better understanding of the factors that have been investigated with a view to improving the efficacy of molecules intended to be used as photosensitizers. Recent publications, spanning the last 5 years, concerning the design, synthesis and clinical usage of photosensitizers for application in PDT and PACT are reviewed, including 5-aminolevulinic acid, porphyrins, chlorins, bacteriochlorins, texaphyrins, phthalocyanines and porphycenes. It has been shown that there are many important considerations when designing a potential PDT/PACT agent, including the influence of added groups on the lipophilicity of the molecule, the positioning and nature of these added groups within the molecule, the presence of a central metal ion and the number of charges that the molecule possesses. The extensive ongoing research within the field has led to the identification of a number of potential lead molecules for application in PDT/PACT. The development of second-generation photosensitizers, possessing shorter periods of photosensitization, longer activation wavelengths and greater selectivity for diseased tissue, provides hope for attaining the ideal photosensitizer that may help PDT and PACT move from laboratory investigation to clinical practice.
Abstract:
The objective of this paper is to describe and evaluate the application of the Text Encoding Initiative (TEI) Guidelines to a corpus of oral French, this being the first corpus of oral French where the TEI has been used. The paper explains the purpose of the corpus, both in creating a specialist corpus of néo-contage that will broaden the range of oral corpora available, and, more importantly, in creating a dataset to explore a variety of oral French that has a particularly interesting status in terms of factors such as conception orale/écrite, réalisation médiale and comportement communicatif (Koch and Oesterreicher 2001). The linguistic phenomena to be encoded are both stylistic (speech and thought presentation) and syntactic (negation, detachment, inversion), and all represent areas where previous research has highlighted the significance of factors such as medium, register and discourse type, as well as a host of linguistic factors (syntactic, phonetic, lexical). After a discussion of how a tagset can be designed and applied within the TEI to encode speech and thought presentation, negation, detachment and inversion, the final section of the paper evaluates the benefits and possible drawbacks of the methodology offered by the TEI when applied to a syntactic and stylistic markup of an oral corpus.
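As an indication of what such markup can look like, the snippet below builds a TEI-style spoken utterance with one of the phenomena mentioned above (omission of 'ne' in negation) marked with a generic <seg> element; the abstract does not give the authors' actual tagset, so the element and attribute choices here are assumptions.

    # Hypothetical TEI-style fragment built with the standard library; <u> and
    # <seg type="..."> are generic TEI elements, but this is not the authors' scheme.
    import xml.etree.ElementTree as ET

    u = ET.Element('u', {'who': '#conteur1'})                  # one spoken utterance
    u.text = 'je '
    seg = ET.SubElement(u, 'seg', {'type': 'negation', 'subtype': 'ne-omission'})
    seg.text = 'sais pas'                                      # "ne" omitted, typical of oral French
    seg.tail = ' ce qui se passe'

    print(ET.tostring(u, encoding='unicode'))
    # <u who="#conteur1">je <seg type="negation" subtype="ne-omission">sais pas</seg> ce qui se passe</u>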
Abstract:
This case study examines how the lean ideas behind the Toyota production system can be applied to software project management. It is a detailed investigation of the performance of a nine-person software development team employed by BBC Worldwide, based in London. The data, collected in 2009, involved direct observations of the development team, the kanban boards, the daily stand-up meetings, semistructured interviews with a wide variety of staff, and statistical analysis. The evidence shows that over the 12-month period, lead time to deliver software improved by 37%, consistency of delivery rose by 47%, and defects reported by customers fell by 24%. The significance of this work is in showing that the use of lean methods, including visual management, team-based problem solving, smaller batch sizes, and statistical process control, can improve software development. It also summarizes key differences between agile and lean approaches to software development. The conclusion is that the performance of the software development team was improved by adopting a lean approach. The faster delivery, with a focus on creating the highest value to the customer, also reduced both technical and market risks. A drawback is that the approach may not fit well with existing corporate standards.
Abstract:
A UV indicator/dosimeter based on benzyl viologen (BV2+) encapsulated in polyvinyl alcohol (PVA) is described. Upon exposure to UV light, the BV2+/PVA film turns a striking purple colour due to the formation of the cation radical, BV•+. The usual oxygen sensitivity of BV•+ is significantly reduced due to the very low oxygen permeability of the encapsulating polymer, PVA. Exposure of a typical BV2+/PVA film, for a set amount of time, to UVB light with different UV indices produces different levels of BV•+, as measured by the absorbance of the film at 550 nm. A plot of the change in absorbance at this wavelength, ΔAbs(550), as a function of UV index, UVI, produces a linear calibration curve which allows the film to be used as a UVB indicator, and a similar procedure could be employed to allow it to be used as a solar UVI indicator. A typical BV2+/PVA film generates a significant, semi-permanent (stable for > 24 h) saturated purple colour (absorbance ~0.8-0.9) upon exposure to sunlight equivalent to a minimal erythemal dose associated with Caucasian skin, i.e. skin type II. The current drawbacks of the film and the possible future use of the BV2+/PVA film as a personal solar UV dosimeter for all skin types are briefly discussed.
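The linear calibration described above can be sketched in a few lines; the UVI and ΔAbs(550) values below are invented for illustration, and only the fit-then-invert procedure reflects the abstract.

    # Hypothetical calibration data: film response (ΔAbs at 550 nm) vs UV index
    import numpy as np

    uvi = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])                  # UV index (made up)
    delta_abs_550 = np.array([0.08, 0.17, 0.33, 0.50, 0.66, 0.82])   # made-up ΔAbs(550)

    slope, intercept = np.polyfit(uvi, delta_abs_550, 1)   # linear calibration curve
    print(f"calibration: ΔAbs(550) ≈ {slope:.3f} * UVI + {intercept:.3f}")

    # invert the calibration to read a UV index back from a measured film response
    measured = 0.40
    print("estimated UVI:", round((measured - intercept) / slope, 1))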
Abstract:
Background: This article describes a 'back to the future' approach to case 'write-ups', with medical students producing handwritten instead of word-processed case reports during their clinical placements. Word-processed reports had been found to have a number of drawbacks, including the inappropriate use of 'cutting and pasting', undue length and lack of focus. Method: We developed a template to be completed by hand, based on the hospital 'clerking-in process', and matched this to a new assessment proforma. An electronic survey was conducted of both students and assessors after the first year of operation to evaluate impact and utility. Results: The new template was well received by both students and assessors. Most students said they preferred handwriting the case reports (55.6%), although a significant proportion (44.4%) preferred the word processor. Many commented that the template enabled them to effectively learn the structure of a case history and to improve their history-taking skills. Most assessors who had previously marked case reports felt the new system represented an improvement. The average time spent marking each report fell from 23.56 to 16.38 minutes using the new proforma. Discussion: Free text comments from the survey have led to the development of a more flexible case report template better suited to certain specialties (e.g. dermatology). This is an evolving process and there will be opportunities for further adaptation as electronic medical records become more common in hospital. © Blackwell Publishing Ltd 2012.
Abstract:
The Richardson-Lucy algorithm is one of the most important algorithms in the image deconvolution area. However, one of its drawbacks is slow convergence. A very significant acceleration is obtained by the technique proposed by Biggs and Andrews (BA), which is implemented in the deconvlucy function of the Image Processing MATLAB toolbox. The BA method was developed heuristically, with no proof of convergence. In this paper, we introduce the Heavy-Ball (H-B) method for Poisson data optimization and extend it to a scaled H-B method, which includes the BA method as a special case. The method has a proven convergence rate of O(1/k^2), where k is the number of iterations. We demonstrate the superior convergence performance of the scaled H-B method on both synthetic and real 3D images.
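For orientation, a bare-bones Richardson-Lucy iteration with a fixed heavy-ball (momentum) extrapolation is sketched below; the BA and scaled H-B methods in the paper choose the extrapolation adaptively, which this sketch does not reproduce.

    # Minimal sketch: Richardson-Lucy deconvolution with a fixed momentum term.
    import numpy as np
    from scipy.signal import fftconvolve

    def rl_heavy_ball(image, psf, n_iter=50, beta=0.4):
        # assumes a 2D image and a 2D PSF normalised to sum to 1
        eps = 1e-12
        x = np.full(image.shape, image.mean(), dtype=float)   # flat, positive start
        x_prev = x.copy()
        for _ in range(n_iter):
            # heavy-ball extrapolation of the previous step (beta = 0 gives plain RL)
            y = np.clip(x + beta * (x - x_prev), eps, None)
            blurred = fftconvolve(y, psf, mode='same')
            ratio = image / np.maximum(blurred, eps)
            x_prev = x
            # multiplicative RL update, evaluated at the extrapolated point
            x = y * fftconvolve(ratio, psf[::-1, ::-1], mode='same')
        return x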
Abstract:
The need to prepare social work students for the demands of changing social environments and to promote student mobility and interest in overseas employment opportunities has resulted in an increasing demand for international social work placements. The literature describes numerous examples of social work programmes that offer a wide variety of international placements. However, research about the actual benefit of undertaking an overseas placement is scant, with limited empirical evidence on the profile of students participating, their experience of the tasks offered, the supervisory practice and the outcomes for students' professional learning and career. This study contributes to the existing body of literature by exploring the relevance of international field placements for students. It is unique in that it draws its sample from students who have graduated, and so provides a distinctive perspective from which to compare their international placement with their other placement/s, as well as evaluating the benefits and drawbacks for them in terms of their careers, employment opportunities and current professional practice.
Abstract:
Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes since they allow the analysis of root causes even when the number of input variables is high. In this work, we investigate the performance of three techniques for classification within a Monte Carlo strategy with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity, namely, a LASSO-like implementation with an L1 norm regularization and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, the situation where the number of variables is higher than the number of available data points and the case of unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior performance in terms of classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to get more robust insights into the problem under analysis when faced with challenging modelling conditions.
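A rough sketch of this kind of Monte Carlo comparison is given below using scikit-learn; the relevance vector machine is not available in scikit-learn and is omitted, and the toy dataset, penalty strength and number of repetitions are placeholders rather than the settings used in the paper.

    # Hypothetical Monte Carlo comparison of naive Bayes and L1-penalised
    # logistic regression on an unbalanced toy dataset with many input variables.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                               weights=[0.8, 0.2], random_state=0)

    n_runs = 100
    selected = np.zeros(X.shape[1])
    acc_nb, acc_l1 = [], []
    for seed in range(n_runs):
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  stratify=y, random_state=seed)
        nb = GaussianNB().fit(X_tr, y_tr)
        l1 = LogisticRegression(penalty='l1', solver='liblinear', C=0.5).fit(X_tr, y_tr)
        acc_nb.append(nb.score(X_te, y_te))
        acc_l1.append(l1.score(X_te, y_te))
        selected += (l1.coef_.ravel() != 0)        # variables kept by the L1 penalty

    print(f"naive Bayes accuracy : {np.mean(acc_nb):.3f}")
    print(f"L1 logistic accuracy : {np.mean(acc_l1):.3f}")
    print("most frequently selected variables:", np.argsort(selected)[::-1][:5])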
Abstract:
Punctal plugs (PPs) are miniature medical implants that were initially developed for the treatment of dry eyes. Since their introduction in 1975, many PPs made from different materials and with different designs have been developed. PPs, albeit generally successful, suffer from drawbacks such as epiphora and suppurative canaliculitis. To overcome these issues, intelligent PP designs have been proposed (e.g. SmartPLUG™ and Form Fit™). PPs are also gaining interest among pharmaceutical scientists for sustained drug delivery to the eye. This review aims to provide an overview of PPs for dry eye treatment and drug delivery to treat a range of ocular diseases. It also discusses current challenges in using PPs for ocular diseases.
Abstract:
This paper addresses the estimation of parameters of a Bayesian network from incomplete data. The task is usually tackled by running the Expectation-Maximization (EM) algorithm several times in order to obtain a high log-likelihood estimate. We argue that choosing the maximum log-likelihood estimate (as well as the maximum penalized log-likelihood and the maximum a posteriori estimate) has severe drawbacks, being affected both by overfitting and model uncertainty. Two ideas are discussed to overcome these issues: a maximum entropy approach and a Bayesian model averaging approach. Both ideas can be easily applied on top of EM, while the entropy idea can be also implemented in a more sophisticated way, through a dedicated non-linear solver. A vast set of experiments shows that these ideas produce significantly better estimates and inferences than the traditional and widely used maximum (penalized) log-likelihood and maximum a posteriori estimates. In particular, if EM is adopted as optimization engine, the model averaging approach is the best performing one; its performance is matched by the entropy approach when implemented using the non-linear solver. The results suggest that the applicability of these ideas is immediate (they are easy to implement and to integrate in currently available inference engines) and that they constitute a better way to learn Bayesian network parameters.
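To ground the discussion, here is a minimal EM baseline for a two-node binary network X -> Y with some X values missing, run from several random starting points as in the multiple-restart practice the abstract mentions; the entropy and Bayesian model averaging estimators themselves are not reproduced here.

    # Toy EM for P(X), P(Y|X=1), P(Y|X=0) with missing X values; illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    x = (rng.random(n) < 0.3).astype(float)                         # true P(X=1) = 0.3
    y = np.where(x == 1, rng.random(n) < 0.9, rng.random(n) < 0.2).astype(float)
    x[rng.random(n) < 0.4] = np.nan                                 # hide 40% of the X values

    def em(x, y, n_iter=200, seed=0):
        r = np.random.default_rng(seed)
        pX, pY1, pY0 = r.random(3)                                  # random starting point
        for _ in range(n_iter):
            # E-step: responsibility that X=1 for the rows where X is missing
            lik1 = pX * np.where(y == 1, pY1, 1 - pY1)
            lik0 = (1 - pX) * np.where(y == 1, pY0, 1 - pY0)
            q = np.where(np.isnan(x), lik1 / (lik1 + lik0), x)
            # M-step: expected-count updates of the three parameters
            pX = q.mean()
            pY1 = (q * y).sum() / q.sum()
            pY0 = ((1 - q) * y).sum() / (1 - q).sum()
        return pX, pY1, pY0

    estimates = np.array([em(x, y, seed=s) for s in range(10)])
    print(np.round(estimates, 3))                                   # per-restart estimates
    print("average over restarts:", np.round(estimates.mean(axis=0), 3))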