969 results for Optimized using


Relevance: 30.00%

Abstract:

There are situations in which it is very important to identify an individual quickly and positively. Examples include suspects detained in the neighborhood of a bombing or terrorist incident, individuals detained while attempting to enter or leave the country, and victims of mass disasters. Systems used for these purposes must be fast, portable, and easy to maintain. The goal of this project was to develop an ultra-fast, direct PCR method for forensic genotyping of oral swabs. The procedure developed eliminates the need for separate cellular digestion and extraction of the sample by performing those steps in the PCR tube itself. Special high-speed polymerases are then added, which are capable of amplifying a newly developed 7-locus multiplex in under 16 minutes. Following the amplification, a postage-stamp-sized microfluidic device equipped with a specially designed entangled-polymer separation matrix yields a complete genotype in 80 seconds. The entire process is rapid and reliable, reducing the time from sample to genotype from 1-2 days to under 20 minutes. Operation requires minimal equipment and can be easily performed with a small high-speed thermal cycler, reagents, a microfluidic device, and a laptop. The system was optimized and validated using a number of test parameters and a small test population. The overall precision was better than 0.17 bp, and the power of discrimination was greater than 1 in 10⁶. The small footprint and ease of use will permit this system to be an effective tool to quickly screen and identify individuals detained at ports of entry, police stations, and remote locations. The system is robust and portable and demonstrates to the forensic community a simple solution to the problem of rapid determination of genetic identity.
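
As an illustration of how a power of discrimination of this order arises from a small multiplex, the sketch below combines hypothetical allele frequencies across seven loci under Hardy-Weinberg assumptions; the loci, allele counts, and frequencies are illustrative, not those used in the study.

```python
import numpy as np

# Hypothetical allele frequencies for a 7-locus multiplex (illustrative values only;
# the actual loci and allele frequencies of the study are not given in the abstract).
allele_freqs = [np.array([0.35, 0.25, 0.20, 0.12, 0.08])] * 7   # each locus sums to 1

def matching_probability(p):
    """Probability that two unrelated individuals share a genotype at one locus,
    assuming Hardy-Weinberg equilibrium: PM = sum of squared genotype frequencies."""
    homozygotes = p ** 2                              # p_i^2
    i, j = np.triu_indices(len(p), k=1)
    heterozygotes = 2 * p[i] * p[j]                   # 2 * p_i * p_j
    genotype_freqs = np.concatenate([homozygotes, heterozygotes])
    return float(np.sum(genotype_freqs ** 2))

# Matching probabilities multiply across independent loci; the power of
# discrimination is the complement of the combined matching probability.
pm_combined = np.prod([matching_probability(p) for p in allele_freqs])
print(f"combined match probability ~ 1 in {1 / pm_combined:,.0f}")
print(f"combined power of discrimination = {1 - pm_combined:.8f}")
```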

Relevance: 30.00%

Abstract:

There is an increasing demand for DNA analysis because of the sensitivity of the method and the ability to uniquely identify and distinguish individuals with a high degree of certainty. This demand, however, has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. Additional processing to separate the different cell types, carried out to simplify the final data interpretation, further adds to the already cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of the male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual intervention, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.

Relevance: 30.00%

Abstract:

Reliability and sensitive information protection are critical aspects of integrated circuits. A novel technique for tamper evidence of electronic components is presented, using near-field evanescent wave coupling between two subwavelength gratings (SWGs), with the input laser source delivered through an optical fiber. The first grating of the pair of coupled subwavelength gratings (CSWGs) was milled directly on the output facet of the silica fiber using focused ion beam (FIB) etching. The second grating was patterned using e-beam lithography and etched into a glass substrate using reactive ion etching (RIE). The slightest intrusion attempt would separate the CSWGs and eliminate near-field coupling between the gratings; tampering would therefore become evident. Computer simulations guided the design for optimal operation of the security solution. The physical dimensions of the SWGs, i.e., period and thickness, were optimized for a 650 nm illuminating wavelength. The optimal dimensions resulted in a 560 nm grating period for the first grating, etched in the silica optical fiber, and 420 nm for the second grating, etched in borosilicate glass. The incident light beam required a half-width at half-maximum (HWHM) of at least 7 µm to allow discernible higher transmission orders, and a HWHM of 28 µm for minimum noise. The minimum number of individual grating lines present on the optical fiber facet was identified as 15 lines. Grating rotation due to the cylindrical geometry of the fiber resulted in a rotation of the far-field pattern, corresponding to the rotation angle of moiré fringes. With the goal of later adding authentication to tamper evidence, the concept of a CSWG signature was also modeled by introducing random and planned variations in the glass grating. The fiber was placed on a stage supported by a nanomanipulator, which permitted three-dimensional displacement while maintaining the fiber tip normal to the surface of the glass substrate. A 650 nm diode laser was fixed to a translation mount that transmitted the light source through the optical fiber, and the output intensity was measured using a silicon photodiode. The evanescent wave coupling output results for the CSWGs were measured and compared to the simulation results.
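
As a quick check on why energy can transfer between the gratings only through the near field, the sketch below applies the normal-incidence grating equation to the quoted periods at 650 nm; the refractive index assumed for the borosilicate substrate is an approximation.

```python
# Order m propagates only if |m * lambda / (n * period)| <= 1 (normal incidence);
# otherwise it is evanescent and decays within roughly a wavelength of the surface.
wavelength_nm = 650.0
gratings = {
    "fiber-facet grating (diffraction into the air gap)": {"period_nm": 560.0, "n": 1.00},
    "substrate grating (diffraction into borosilicate)": {"period_nm": 420.0, "n": 1.51},  # assumed index
}

for name, g in gratings.items():
    s = wavelength_nm / (g["n"] * g["period_nm"])   # |sin(theta)| required for m = +/-1
    status = "evanescent" if s > 1 else "propagating"
    print(f"{name}: m = +/-1 needs |sin(theta)| = {s:.2f} -> {status}")
```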

Relevance: 30.00%

Abstract:

Current commercially available mimics contain varying amounts of either the actual explosive/drug or the chemical compound suspected to be of interest to biological detectors. As a result, there is significant interest in determining the dominant chemical odor signatures of the mimics, often referred to as pseudos, particularly in comparison with the genuine contraband material. This dissertation discusses results obtained from the analysis of drug and explosive headspace related to the odor profiles recognized by trained detection canines. Analysis was performed using headspace solid-phase microextraction in conjunction with gas chromatography-mass spectrometry (HS-SPME-GC-MS). Upon determination of specific odors, field trials were held using a combination of the target odors with COMPS. Piperonal was shown to be a dominant odor compound in the headspace of some ecstasy samples and a recognizable odor mimic by trained detection canines. It was also shown that detection canines could be imprinted on piperonal COMPS and correctly identify ecstasy samples at a threshold level of approximately 100 ng/s. Isosafrole and/or MDP-2-POH show potential as training aid mimics for non-piperonal-based MDMA. Acetic acid was shown to be dominant in the headspace of heroin samples and verified as a dominant odor in commercial vinegar samples; however, no common secondary compound was detected in the headspace of either. Because of the similarities detected within respective explosive classes, several compounds were chosen as explosive mimics. A single-based smokeless powder with a detectable level of 2,4-dinitrotoluene, a double-based smokeless powder with a detectable level of nitroglycerine, 2-ethyl-1-hexanol, DMNB, and ethyl centralite together with diphenylamine were shown to be accurate mimics for TNT-based explosives, NG-based explosives, plastic explosives, tagged explosives, and smokeless powders, respectively. The combination of these six odors represents a comprehensive explosive odor kit with positive results for imprinting on detection canines. As a proof of concept, the chemical compound PFTBA showed promise as a possible universal, non-target odor compound for comparison and calibration of detection canines and instrumentation. In a comparison study of shape versus vibration odor theory, the detection of d-methyl benzoate and methyl benzoate was explored using canine detectors. While the results did not overwhelmingly substantiate either theory, shape odor theory provides a better explanation of the canine and human subject responses.

Relevance: 30.00%

Abstract:

The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase shift keying modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around 3(Formula presented.) target and achieve the target with around 0.2 dB extra signal-to-noise ratio.
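
For reference, a row-write/column-read block interleaver of the kind used above can be sketched as follows; the dimensions are illustrative, not the optimized values from the paper.

```python
import numpy as np

def interleave(bits, rows, cols):
    """Write the bit stream row-wise into a rows x cols block, read it out column-wise."""
    assert len(bits) == rows * cols
    return np.asarray(bits).reshape(rows, cols).T.reshape(-1)

def deinterleave(bits, rows, cols):
    """Inverse operation: write column-wise, read row-wise."""
    assert len(bits) == rows * cols
    return np.asarray(bits).reshape(cols, rows).T.reshape(-1)

rows, cols = 8, 16                                  # illustrative block dimensions
data = np.random.randint(0, 2, rows * cols)
assert np.array_equal(deinterleave(interleave(data, rows, cols), rows, cols), data)
# A burst of consecutive channel errors (e.g. after a phase slip) is spread across
# different rows on deinterleaving, so each BCH codeword sees only a few of them.
```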

Relevance: 30.00%

Abstract:

We report the results of a study into the factors controlling the quality of nanolithographic imaging. Self-assembled monolayer (SAM) coverage, subsequent post-etch pattern definition, and minimum feature size all depend on the quality of the Au substrate used in material-mask atomic nanolithography experiments. We find that sputtered Au substrates yield much smoother surfaces and a higher density of {111}-oriented grains than evaporated Au surfaces. Phase imaging with an atomic force microscope shows that the quality and percentage coverage of SAM adsorption are much greater for sputtered Au surfaces. Exposure of the self-assembled monolayer to an optically cooled atomic Cs beam traversing a two-dimensional array of submicron material masks, mounted a few microns above the monolayer surface, allowed determination of the minimum average Cs dose (2 Cs atoms per self-assembled monolayer molecule) required to write the monolayer. Suitable wet etching, with an etch rate of 2.2 nm min⁻¹, results in optimized pattern definition. Utilizing these optimizations, material mask features as small as 230 nm in diameter with a fractional depth gradient of 0.820 nm were realized.

Relevance: 30.00%

Abstract:

Radiotherapy is commonly used to treat lung cancer; however, radiation-induced damage to lung tissue is a major limiting factor in its use. To minimize normal-tissue lung toxicity in conformal radiotherapy treatment planning, we investigated the use of Perfluoropropane (PFP)-enhanced MR imaging to assess and guide the sparing of functioning lung. Fluorine-enhanced MRI using Perfluoropropane (PFP) is a dynamic, multi-breath, steady-state technique enabling quantitative and qualitative assessments of lung function (1).

Imaging data were obtained from studies previously acquired in the Duke Image Analysis Laboratory. All studies were approved by the Duke IRB. The data were de-identified for this project, which was also approved by the Duke IRB. Subjects performed several breath-holds at total lung capacity (TLC) interspersed with multiple tidal breaths (TB) of a Perfluoropropane (PFP)/oxygen mixture. Additive wash-in intensity images were created through the summation of the wash-in-phase breath-holds. Additionally, model-based fitting was used to create parametric images of lung function (1).
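
A minimal sketch of per-voxel model-based fitting of such a wash-in series follows; the single-exponential wash-in model, the number of breaths, and the signal values are illustrative assumptions rather than the exact model used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def washin(n, s_eq, tau):
    """Signal after n wash-in breaths: approaches the equilibrium value s_eq
    with a time constant tau expressed in breaths."""
    return s_eq * (1.0 - np.exp(-n / tau))

breaths = np.arange(1, 7)                                        # six wash-in breath-holds
voxel_signal = np.array([310, 520, 660, 750, 800, 840], float)   # illustrative voxel data

(s_eq, tau), _ = curve_fit(washin, breaths, voxel_signal, p0=[voxel_signal.max(), 2.0])
print(f"equilibrium signal ~ {s_eq:.0f}, wash-in time constant ~ {tau:.2f} breaths")
# Repeating the fit voxel-by-voxel gives parametric maps (e.g. of tau) that can be
# used to delineate well-ventilated lung for the planning step described below.
```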

Varian Eclipse treatment planning software was used for putative treatment planning. For each subject, two plans were made: a standard plan, which considered no regional functional lung information beyond current standard models, and a functional plan, which used the functional information to spare functioning lung while maintaining dose to the target lesion. Plans were optimized to a prescription dose of 60 Gy to the target over the course of 30 fractions.

For all five subjects, a decrease in dose to functioning lung was observed when this functional information was utilized, compared with the standard plan. PFP-enhanced MR imaging is a feasible method for assessing ventilatory lung function, and we have shown how it can be incorporated into treatment planning to potentially decrease the dose to normal tissue.

Relevance: 30.00%

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since CT was first introduced in the 1970s, substantial technical improvements have led to its expanding use in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase, mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts within the community to manage and optimize CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility to optimize the radiation dose used in CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to translate the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under the constant tube current condition. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of the organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date, with a representative range of age, weight percentile, and body mass index (BMI).
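
For context, the sketch below shows how CTDIvol-normalized organ dose coefficients of this kind are typically applied, with the coefficient modeled as an exponential function of patient effective diameter; the functional form, the constants, and the example values are illustrative assumptions rather than the coefficients reported in the thesis.

```python
import numpy as np

def organ_dose_coefficient(effective_diameter_cm, a=3.0, b=0.04):
    """Illustrative h factor (mGy organ dose per mGy CTDIvol), decreasing with patient size."""
    return a * np.exp(-b * effective_diameter_cm)

def estimate_organ_dose(ctdi_vol_mgy, effective_diameter_cm):
    # Organ dose ~ size-dependent coefficient times the exam CTDIvol.
    return organ_dose_coefficient(effective_diameter_cm) * ctdi_vol_mgy

# Example: a patient with a 28 cm effective diameter scanned at CTDIvol = 10 mGy.
print(f"estimated organ dose ~ {estimate_organ_dose(10.0, 28.0):.1f} mGy")
```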

With effective quantification of organ dose under the constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated from Monte Carlo simulations in which the TCM function is explicitly modeled.

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study, we focused on body CT examinations; the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients to a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose in place, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
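
A simplified sketch of the image-based noise addition step described above is given below; it scales white Gaussian noise using the usual inverse relationship between quantum noise variance and dose, whereas the actual tool would also need to reproduce the noise texture of reduced-dose scans.

```python
import numpy as np

def simulate_reduced_dose(image_hu, sigma_full_hu, dose_fraction, seed=None):
    """Add noise to a full-dose image so its total quantum noise matches a scan
    acquired at dose_fraction of the original dose (noise variance scales as 1/dose)."""
    rng = np.random.default_rng(seed)
    sigma_add = sigma_full_hu * np.sqrt(1.0 / dose_fraction - 1.0)
    return image_hu + rng.normal(0.0, sigma_add, size=image_hu.shape)

# Illustrative use: a uniform 40 HU patch with 12 HU noise, simulated at 50% dose.
full_dose = np.full((256, 256), 40.0)
half_dose = simulate_reduced_dose(full_dose, sigma_full_hu=12.0, dose_fraction=0.5, seed=0)
print(f"added noise sigma ~ {half_dose.std():.1f} HU (12.0 HU expected at 50% dose)")
```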

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance: 30.00%

Abstract:

Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of design variables [1]. The approach in this paper links the adjoint surface sensitivities (the gradient of the objective function with respect to surface movement) with the parametric design velocities (the movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to the CAD variables.
For a successful implementation of shape optimisation strategies in practical industrial cases, the choice of design variables or parameterisation scheme used for the model to be optimised plays a vital role. Where the goal is to base the optimisation on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history, preserving the design intent [3]. The main advantage of using the feature-based model is that the optimised model produced can be used directly for downstream applications, including manufacturing and process planning.
This paper presents an approach for optimization based on the feature based CAD model, which uses CAD parameters defining the features in the model geometry as the design variables. In order to capture the CAD surface movement with respect to the change in design variable, the “Parametric Design Velocity” is calculated, which is defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating the design velocities represents an advancement in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted by other CAD-based optimisation processes. It considers every continuous ("real-valued") parameter type as an optimisation variable, and it can be adapted to work with any CAD modelling software, as long as the software has an API which provides access to the values of the parameters which control the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD models before and after the parameter perturbation. The implementation procedure includes calculating the geometrical movement along the normal direction between two discrete representations of the original and perturbed geometries. Parametric design velocities can then be directly linked with adjoint surface sensitivities to extract the gradients used in a gradient-based optimisation algorithm.
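
The linkage described above, in which adjoint surface sensitivities are combined with parametric design velocities through a discrete surface integral and the design velocity is obtained by finite differences of the surface position along the normal, can be sketched as follows. The array names, the facet-based discretisation, and the random stand-in data are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def design_velocity_fd(x_base, x_perturbed, normals, delta_p):
    """Finite-difference design velocity: normal component of the facet-centre
    displacement between base and perturbed CAD geometries per unit parameter change."""
    displacement = x_perturbed - x_base                 # (n_facets, 3)
    return np.einsum("ij,ij->i", displacement, normals) / delta_p

def parameter_gradient(sensitivity, design_velocity, facet_area):
    """dJ/dp ~ sum over surface facets of sensitivity * design velocity * facet area."""
    return float(np.sum(sensitivity * design_velocity * facet_area))

# Random arrays standing in for the CFD adjoint output and the two CAD surface meshes.
rng = np.random.default_rng(1)
n_facets = 1000
sens = rng.normal(size=n_facets)                        # adjoint sensitivity per facet
vel = design_velocity_fd(rng.normal(size=(n_facets, 3)),
                         rng.normal(size=(n_facets, 3)),
                         rng.normal(size=(n_facets, 3)),
                         delta_p=0.01)
area = np.abs(rng.normal(size=n_facets))                # facet areas
print("dJ/dp ~", parameter_gradient(sens, vel, area))
```
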
A flow optimisation problem is presented in which the power dissipation of the flow in an automotive air duct is to be reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost function gradients. A line-search algorithm is then used to update the design variables and proceed with the optimisation process.

Relevance: 30.00%

Abstract:

Doctorate in Biosystems Engineering - Instituto Superior de Agronomia - UL

Relevance: 30.00%

Abstract:

Purpose: To develop fluoxetine orally disintegrating tablets (ODTs) and optimise the variables that influence their formulation. Methods: Fluoxetine ODTs were prepared using the direct compression method. A three-factor, three-level Box-Behnken design was used to develop and optimize the fluoxetine ODT formulation. The design suggested 15 formulations with different lubricant concentrations (X1), lubricant mixing times (X2), and compression forces (X3), and their effects were monitored on tablet weight (Y1), thickness (Y2), hardness (Y3), % friability (Y4), and disintegration time (Y5). Results: All powder blends showed acceptable flow properties, ranging from good to excellent. The disintegration time (Y5) was affected directly by lubricant concentration (X1). Lubricant mixing time (X2) had a direct effect on tablet thickness (Y2) and hardness (Y3), while compression force (X3) had a direct impact on tablet hardness (Y3), % friability (Y4), and disintegration time (Y5). Accordingly, the Box-Behnken design suggested an optimized formula of 0.86 mg (X1), 15.3 min (X2), and 10.6 kN (X3). Finally, the percentage prediction errors for responses Y1, Y2, Y3, Y4, and Y5 were 0.31, 0.52, 2.13, 3.92 and 3.75%, respectively. Formulas 4 and 8 achieved 90% drug release within the first 5 min of the dissolution test. Conclusion: A fluoxetine ODT formulation has been developed and optimized successfully using a Box-Behnken design and has been manufactured efficiently using the direct compression technique.
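
For reference, the 15-run, three-factor Box-Behnken layout behind these formulations can be sketched as follows: 12 edge-midpoint runs (two factors at their coded ±1 levels, the third at the centre) plus three centre points. The factor ranges used to decode the coded levels into physical units are assumptions for illustration, not the ranges used in the study.

```python
import numpy as np
from itertools import combinations, product

factors = ["lubricant_conc_mg", "mixing_time_min", "compression_force_kN"]
ranges = {"lubricant_conc_mg": (0.5, 1.5),         # assumed low/high levels
          "mixing_time_min": (5.0, 20.0),
          "compression_force_kN": (5.0, 15.0)}

# Build the coded Box-Behnken matrix: pairs of factors at +/-1, the third at 0.
runs = []
for i, j in combinations(range(3), 2):
    for a, b in product((-1, 1), repeat=2):
        row = [0, 0, 0]
        row[i], row[j] = a, b
        runs.append(row)
runs += [[0, 0, 0]] * 3                            # three centre points -> 15 runs
design_coded = np.array(runs, dtype=float)

# Decode -1/0/+1 levels into physical factor settings.
decoded = np.empty_like(design_coded)
for k, name in enumerate(factors):
    lo, hi = ranges[name]
    decoded[:, k] = (lo + hi) / 2 + design_coded[:, k] * (hi - lo) / 2
print(decoded.shape)                               # (15, 3): one row per formulation
```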

Relevance: 30.00%

Abstract:

In the absence of effective vaccine(s), control of African swine fever caused by African swine fever virus (ASFV) must be based on early, efficient, cost-effective detection and strict control and elimination strategies. For this purpose, we developed an indirect ELISA capable of detecting ASFV antibodies in either serum or oral fluid specimens. The recombinant protein used in the ELISA was selected by comparing the early serum antibody response of ASFV-infected pigs (NHV-p68 isolate) to three major recombinant polypeptides (p30, p54, p72) using a multiplex fluorescent microbead-based immunoassay (FMIA). Non-hazardous (non-infectious) antibody-positive serum for use as plate positive controls and for the calculation of sample-to-positive (S:P) ratios was produced by inoculating pigs with a replicon particle (RP) vaccine expressing the ASFV p30 gene. The optimized ELISA detected anti-p30 antibodies in serum and/or oral fluid samples from pigs inoculated with ASFV under experimental conditions beginning 8 to 12 days post inoculation. Tests on serum (n = 200) and oral fluid (n = 200) field samples from an ASFV-free population demonstrated that the assay was highly diagnostically specific. The convenience and diagnostic utility of oral fluid sampling combined with the flexibility to test either serum or oral fluid on the same platform suggests that this assay will be highly useful under the conditions for which OIE recommends ASFV antibody surveillance, i.e., in ASFV-endemic areas and for the detection of infections with ASFV isolates of low virulence.
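
For context, the conventional sample-to-positive (S:P) ratio calculation referred to above is sketched below; the optical-density values and any positivity cutoff are illustrative, as the abstract does not specify them.

```python
def sp_ratio(sample_od, neg_ctrl_mean_od, pos_ctrl_mean_od):
    """S:P = (sample OD - mean negative-control OD) / (mean positive-control OD - mean negative-control OD)."""
    return (sample_od - neg_ctrl_mean_od) / (pos_ctrl_mean_od - neg_ctrl_mean_od)

# Illustrative plate readings (not from the study).
print(round(sp_ratio(sample_od=1.10, neg_ctrl_mean_od=0.08, pos_ctrl_mean_od=1.60), 2))
```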

Relevance: 30.00%

Abstract:

The isoprene degradation mechanism included in version 3 of the Master Chemical Mechanism (MCM v3) has been evaluated and refined, using the Statewide Air Pollution Research Center (SAPRC) environmental chamber datasets on the photo-oxidation of isoprene and its degradation products, methacrolein (MACR) and methyl vinyl ketone (MVK). Prior to this, the MCM v3 butane degradation chemistry was also evaluated using chamber data on the photo-oxidation of butane and its degradation products, methyl ethyl ketone (MEK), acetaldehyde (CH3CHO) and formaldehyde (HCHO), in conjunction with an initial evaluation of the chamber-dependent auxiliary mechanisms for the series of relevant chambers. The MCM v3 mechanisms for both isoprene and butane generally performed well and were found to provide an acceptable reaction framework for describing the NOx photo-oxidation experiments on the above systems, although a number of parameter modifications and refinements were identified which resulted in an improved performance. All of these relate to the magnitude of sources of free radicals from organic chemical processes, such as carbonyl photolysis rates and the yields of radicals from the reactions of O3 with unsaturated oxygenates, and specific recommendations are made for refinements. In addition, it was necessary to include a representation of the reactions of O(3P) with isoprene, MACR and MVK (which were not previously treated in MCM v3), and conclusions are drawn concerning the required extent of free radical formation from these reactions. Throughout the study, the performance of MCM v3 was also compared with that of the SAPRC-99 mechanism, which was developed and optimized in conjunction with the chamber datasets.

Relevance: 30.00%

Abstract:

The representation of alkene degradation in version 3 of the Master Chemical Mechanism (MCM v3) has been evaluated, using environmental chamber data on the photo-oxidation of ethene, propene, 1-butene and 1-hexene in the presence of NOx, from up to five chambers at the Statewide Air Pollution Research Center (SAPRC) at the University of California. As part of this evaluation, it was necessary to include a representation of the reactions of the alkenes with O(3P), which are significant under chamber conditions but generally insignificant under atmospheric conditions. The simulations for the ethene and propene systems, in particular, were found to be sensitive to the branching ratios assigned to molecular and free radical forming pathways of the O(3P) reactions, with the extent of radical formation required for proper fitting of the model to the chamber data being substantially lower than the reported consensus. With this constraint, the MCM v3 mechanisms for ethene and propene generally performed well. The sensitivity of the simulations to the parameters applied to a series of other radical sources and sink reactions (radical formation from the alkene ozonolysis reactions and product carbonyl photolysis; radical removal from the reaction of OH with NO2 and β-hydroxynitrate formation) were also considered, and the implications of these results are discussed. Evaluation of the MCM v3 1-butene and 1-hexene degradation mechanisms, using a more limited dataset from only one chamber, was found to be inconclusive. The results of sensitivity studies demonstrate that it is impossible to reconcile the simulated and observed formation of ozone in these systems for ranges of parameter values which can currently be justified on the basis of the literature. As a result of this work, gaps and uncertainties in the kinetic, mechanistic and chamber database are identified and discussed, in relation to both tropospheric chemistry and chemistry important under chamber conditions which may compromise the evaluation procedure, and recommendations are made for future experimental studies. Throughout the study, the performance of the MCM v3 chemistry was also simultaneously compared with that of the corresponding chemistry in the SAPRC-99 mechanism, which was developed and optimized in conjunction with the chamber datasets.