854 results for based inspection and conditional monitoring


Relevance:

100.00%

Publisher:

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase, mainly due to advanced technologies such as multi-energy [4], photon-counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-level radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing almost half of the total dose from medical exposure [8]. For an individual patient, the risk associated with a single CT examination is likely minimal. However, the comparatively large population-level radiation burden has prompted substantial efforts within the community to manage and optimize CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a professional responsibility to optimize the radiation dose of CT examinations. The key to dose optimization is determining the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, the exam could be personalized by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to translate the theoretical models into clinical practice by developing an organ-based dose-monitoring system and an image-based noise-addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study effectively modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.

With organ dose effectively quantified under constant tube current conditions, Chapter 4 aims to extend the organ dose-prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with the dose estimated from Monte Carlo simulations in which the TCM function was explicitly modeled.
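
The convolution-based idea can be illustrated in a few lines. The sketch below is an invented minimal example, not the thesis's actual model: the dose-spread kernel, the sinusoidal modulation profile, and the organ dose coefficient are all assumed values, chosen only to show how a longitudinal tube current profile, a scatter kernel, and a dose coefficient combine into an organ dose estimate.

```python
import numpy as np

def organ_dose_tcm(ma_profile, dose_kernel, dz, coeff):
    """Estimate organ dose by convolving the longitudinal tube current
    profile with a normalized dose-spread kernel along z, then scaling
    by a (hypothetical) organ dose coefficient in mGy per mAs."""
    # Radiation field seen by the organ: each slice's contribution,
    # weighted by scatter spread along the z-axis.
    field = np.convolve(ma_profile, dose_kernel, mode="same") * dz
    return coeff * field.sum()

# Invented example: sinusoidal mA modulation over a 40 cm scan range.
z = np.arange(0, 40, 1.0)                          # slice positions (cm)
ma = 200 + 80 * np.sin(2 * np.pi * z / 20)         # modulated tube current (mA)
kernel = np.exp(-np.abs(np.arange(-5, 6)) / 2.0)   # assumed scatter kernel
kernel /= kernel.sum()                             # normalize to unit area

dose = organ_dose_tcm(ma, kernel, dz=1.0, coeff=1e-3)
```

A Monte Carlo validation, as described above, would compare such convolution estimates against simulations that model the modulation profile explicitly.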

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice as an organ dose-monitoring program built on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, so the patient's major body landmarks were extracted from the scout image in order to match each clinical patient against a computational phantom in the library. The organ dose coefficients were estimated from the CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
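
Step (1), the image-based noise addition, can be sketched under a deliberately simple assumption: quantum noise variance scales inversely with dose, so simulating a fraction f of the original dose means injecting noise with standard deviation sigma_full * sqrt(1/f - 1). The function below is a hypothetical illustration of that relation only; a clinical tool such as the one described would additionally model the spatially varying magnitude and texture of CT noise.

```python
import numpy as np

def simulate_low_dose(image, sigma_full, dose_fraction, rng=None):
    """Add zero-mean Gaussian noise so the result carries the noise
    level expected at `dose_fraction` of the original dose, assuming
    white noise and variance inversely proportional to dose."""
    rng = np.random.default_rng(rng)
    # sigma_low^2 = sigma_full^2 / f  =>  sigma_add^2 = sigma_full^2 * (1/f - 1)
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, image.shape)

img = np.full((64, 64), 40.0)   # synthetic uniform "tissue" patch (HU)
low = simulate_low_dose(img, sigma_full=10.0, dose_fraction=0.25, rng=0)
# At 25% dose the expected total noise is 10 / sqrt(0.25) = 20 HU, so the
# injected component should have sigma ~ sqrt(20^2 - 10^2) ~ 17.3 HU.
```

Steps (2)-(4) then hinge on whether such simulated images are indistinguishable from genuinely low-dose acquisitions, which is why the validation of realism precedes the observer studies.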

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance:

100.00%

Publisher:

Abstract:

All A’s was designed to support the agency’s family-strengthening initiatives in South Florida. All A’s uses evidence-informed strategies in an inclusive curriculum that teaches self-determination and adaptive behavior skills. The framework incorporates problem-based learning and adult learning theory and follows the Universal Design for Learning. Since 2012, the agency has served over 8,500 youth and 4,000 adults using the framework. The framework addresses educational underachievement and career readiness in at-risk populations. It is used to enhance participants’ AWARENESS of setting SMART goals to achieve future goals and career aspirations. Participants are provided with ACCESS to resources and opportunities for creating and implementing an ACTION plan as they pursue and ACHIEVE their goals. All A’s promotes protective factors and exposes youth to career pathways in Science, Technology, Engineering and Math (STEM) related fields. Youth participate in college tours, job-site visits, job shadowing, high school visits, online college and career preparation assistance, service-learning projects, STEM projects, and the Winning Futures© mentoring program. Adults are assisted with résumé development, job-search strategies, interview techniques, job-shadowing experiences, and computer and financial literacy programs. Adults and youth are also given the opportunity to complete industry-recognized certifications in high-demand industries (food service, general labor, and construction) and test preparation for the General Educational Development Test.

Relevance:

100.00%

Publisher:

Abstract:

Harmful algal blooms (HABs) are becoming more frequent as the climate changes, with tropical species moving northward. Monitoring programs that detect the presence of toxic algae before they bloom are of paramount importance to protect aquatic ecosystems, aquaculture, human health, and local economies. Rapid and reliable species identification methods using molecular barcodes coupled to biosensor detection tools have received increasing attention over the past decade as an alternative to the impractical standard microscopy-based counting techniques. This work reports on a PCR-amplification-free electrochemical genosensor for the enhanced selective and sensitive detection of RNA from multiple Mediterranean toxic algal species. For a sandwich hybridization assay (SHA), we designed longer capture and signal probes for more specific target discrimination against single base-pair mismatches from closely related species and for reproducible signals. We optimized the experimental conditions, viz., the minimal probe concentration in the SHA on a screen-printed gold electrode, and selected the best electrochemical mediator. Probes from 13 Mediterranean dinoflagellate species were tested under optimized conditions, and the format was further tested for quantification of RNA from environmental samples. We not only enhanced the selectivity and sensitivity of state-of-the-art toxic algal genosensors but also increased the repertoire of toxic algal biosensors in the Mediterranean, towards an integral and automatic monitoring system.

Relevance:

100.00%

Publisher:

Abstract:

Aim. The purpose of this study was to develop and evaluate a computer-based dietary and physical activity self-management program for people recently diagnosed with type 2 diabetes.
Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)) measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups.
Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control, mean age 59 (SD) years). After completion there was a significant between-group difference in the “knowledge and beliefs scale” of the DOQ. Two-thirds of the intervention group rated the program as either good or very good, 92% would recommend the program to others, and 96% agreed that the information within the program was clear and easy to understand.
Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Nature-based solutions promoting green and blue urban areas have significant potential to decrease the vulnerability and enhance the resilience of cities in light of climatic change. They can thereby help to mitigate climate change-induced impacts and serve as proactive adaptation options for municipalities. We explore the various contexts in which nature-based solutions are relevant for climate mitigation and adaptation in urban areas, identify indicators for assessing the effectiveness of nature-based solutions, and highlight related knowledge gaps. In addition, we explore existing barriers and potential opportunities for increasing the scale and effectiveness of nature-based solution implementation. The results were derived from an inter- and transdisciplinary workshop with experts from research, municipalities, policy, and society. As an outcome of the workshop discussions, and building on existing evidence, we highlight three main needs for future science and policy agendas when dealing with nature-based solutions: (i) produce stronger evidence on nature-based solutions for climate change adaptation and mitigation and raise awareness by increasing implementation; (ii) address governance challenges in implementing nature-based solutions by using reflexive approaches, which implies bringing together new networks of society, nature-based solution ambassadors, and practitioners; (iii) consider socio-environmental justice and social cohesion when implementing nature-based solutions by using integrated governance approaches that allow for integrative and transdisciplinary participation of diverse actors. Taking these needs into account, nature-based solutions can serve as climate mitigation and adaptation tools that produce additional co-benefits for societal well-being, thereby serving as strong investment options for sustainable urban planning.

Relevance:

100.00%

Publisher:

Abstract:

Aim: The European Commission Cooperation in Science and Technology (COST) Action FA1203 “SMARTER” aims to make recommendations for the sustainable management of Ambrosia across Europe and for monitoring its efficiency and cost-effectiveness. The goal of the present study is to provide a baseline for spatial and temporal variations in airborne Ambrosia pollen in Europe that can be used for the management and evaluation of this noxious plant. Location: The full range of Ambrosia artemisiifolia L. distribution over Europe (39°N–60°N; 2°W–45°E). Methods: Airborne Ambrosia pollen data for the principal flowering period of Ambrosia (August–September) recorded during a 10-year period (2004–2013) were obtained from 242 monitoring sites. The mean sum of daily average airborne Ambrosia pollen and the number of days on which Ambrosia pollen was recorded in the air were analysed. The mean and standard deviation (SD) were calculated regardless of the number of years included in the study period, while trends are based on time series with 8 or more years of data. Trends were considered significant at p < 0.05. Results: There were few significant trends in the magnitude and frequency of atmospheric Ambrosia pollen (only 8% for the mean sum of daily average Ambrosia pollen concentrations and 14% for the mean number of days Ambrosia pollen was recorded in the air). Main conclusions: The direction of any trends varied locally and reflected changes in the sources of the pollen, either in size or in distance from the monitoring station. Pollen monitoring is important for providing an early warning of the expansion of this invasive and noxious plant.
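
The per-station trend test described above (slope of the annual pollen index on year, significance at p < 0.05, series of at least 8 years) can be sketched as an ordinary least-squares fit with a t-test on the slope. The station data below are invented for illustration, and the hard-coded critical value assumes a 10-year series (n - 2 = 8 degrees of freedom).

```python
import numpy as np

def trend(years, index, t_crit=2.306):
    """OLS slope of the annual pollen index on year, and whether the
    trend is significant at the two-sided 5% level; t_crit = 2.306 is
    Student's t critical value for 8 degrees of freedom."""
    x, y = np.asarray(years, float), np.asarray(index, float)
    n = len(x)
    xc, yc = x - x.mean(), y - y.mean()
    slope = (xc * yc).sum() / (xc ** 2).sum()
    resid = yc - slope * xc                       # residuals about the fit
    se = np.sqrt((resid ** 2).sum() / (n - 2) / (xc ** 2).sum())
    return slope, abs(slope / se) > t_crit

# Invented 10-year series for one hypothetical monitoring station.
years = range(2004, 2014)
index = [120, 135, 128, 150, 161, 158, 170, 182, 179, 195]
slope, significant = trend(years, index)
```

Under the study's criteria a station like this would count among the minority showing a significant (here increasing) trend.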

Relevance:

100.00%

Publisher:

Abstract:

Graphene has emerged as an extraordinary material, with the capability to accommodate an array of remarkable electronic, mechanical, and chemical properties. Its extra-large surface-to-volume ratio gives graphene a highly flexible morphology, giving rise to intriguing observations such as ripples, wrinkles, and folds, as well as the potential to transform into other novel carbon nanostructures. Ultra-thin, mechanically tough, electrically conductive graphene films promise to enable a wealth of possible applications, ranging from hydrogen storage scaffolds and electronic transistors to bottom-up material design. Enthusiasm for graphene-based applications aside, there are still significant challenges to their realization, largely due to the difficulty of precisely controlling the graphene properties. Controlling the graphene morphology over large areas is crucial to enabling future graphene-based applications and material design. This dissertation aims to shed light on potential mechanisms to actively manipulate the graphene morphology and properties, and thereby enable material design principles that deliver desirable mechanical and electronic functionalities of graphene and its derivatives.

Relevance:

100.00%

Publisher:

Abstract:

New methods of nuclear fuel and cladding characterization must be developed and implemented to enhance the safety and reliability of nuclear power plants. One class of such advanced methods aims to characterize fuel performance by performing minimally intrusive, in-core, real-time measurements on nuclear fuel at the nanometer scale. Nuclear power plants depend on instrumentation and control systems for monitoring, control, and protection. Traditionally, fuel characterization under irradiation is performed using a “cook and look” method. These methods are very expensive and labor-intensive, since they require removal, inspection, and return of irradiated samples for each measurement. Such fuel cladding inspection methods investigate oxide layer thickness, wear, dimensional changes, ovality, nuclear fuel growth, and nuclear fuel defect identification. These methods are also not suitable for all commercial nuclear power applications, as they are not always available to the operator when needed. Additionally, such techniques often provide limited data and may exacerbate the phenomena being investigated. This thesis investigates a novel nanostructured sensor, based on a photonic crystal design, that is implemented in a nuclear reactor environment. The aim of this work is to produce an in-situ, radiation-tolerant sensor capable of measuring the deformation of a nuclear material during nuclear reactor operations. The sensor was fabricated on the surface of nuclear reactor materials (specifically, steel- and zirconium-based alloys). Charged-particle and mixed-field irradiations were performed on a newly developed “pelletron” beamline at Idaho State University's Research and Innovation in Science and Engineering (RISE) complex and at the University of Maryland's 250 kW Training Reactor (MUTR).
The sensors were irradiated to six different fluences (ranging from 1 to 100 dpa), followed by intensive characterization using focused ion beam (FIB) milling, transmission electron microscopy (TEM), and scanning electron microscopy (SEM) to investigate the physical deformation and microstructural changes between fluence levels and to provide high-resolution information regarding the material performance. Computer modeling (SRIM/TRIM) was employed to simulate damage to the sensor and to provide significant information concerning the penetration depth of the ions into the material.

Relevance:

100.00%

Publisher:

Abstract:

The challenge of detecting a change in the distribution of data is a sequential decision problem relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for the exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically and independently distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. Infinite-horizon problems with a linear penalty for detection delay and identically and independently distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
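
The cumulative-sum (CUSUM) procedure mentioned for the i.i.d. infinite-horizon case can be sketched concisely: accumulate the log-likelihood ratio of the post-change versus pre-change distribution, reset at zero, and raise an alarm when the statistic crosses a threshold. The sketch below assumes Gaussian observations with a mean shift; the distributions, threshold, and deterministic data are invented for illustration and are not the dissertation's parameterization.

```python
def cusum(observations, mu0=0.0, mu1=1.0, sigma=1.0, h=5.0):
    """Return the first index at which the CUSUM statistic crosses the
    threshold h, or None if no change is declared."""
    s = 0.0
    for k, x in enumerate(observations):
        # Log-likelihood ratio of N(mu1, sigma^2) vs N(mu0, sigma^2).
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma ** 2
        s = max(0.0, s + llr)   # reset at zero: guards the worst-case change time
        if s > h:
            return k
    return None

# Mean shifts from 0 to 1.2 at index 20 (deterministic data for clarity).
data = [0.0] * 20 + [1.2] * 20
alarm = cusum(data)   # statistic grows by 0.7 per post-change sample
```

The threshold h trades false-alarm rate against detection delay, which is exactly the linear delay-penalty trade-off that the epsilon-optimal parameterization tunes.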

Relevance:

100.00%

Publisher:

Abstract:

We present ideas about creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so could we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation, and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
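
As a rough intuition for what "correlating danger signals" could mean computationally, the toy sketch below fuses weighted per-process signals into a verdict. It is a deliberately simplified, hypothetical illustration, much cruder than the Dendritic Cell Algorithm later derived from Danger Theory; the signal names and weights are invented.

```python
# Invented weights: PAMPs (known-bad signatures) and danger signals
# (e.g. crashes, abnormal syscall rates) push toward an alert, while
# safe signals (evidence of normal behaviour) suppress it.
WEIGHTS = {"pamp": 2.0, "danger": 1.0, "safe": -1.5}

def assess(events, threshold=3.0):
    """Correlate (kind, magnitude) signal events for one monitored
    process into a single score and compare against a threshold."""
    score = sum(WEIGHTS[kind] * value for kind, value in events)
    return "anomalous" if score > threshold else "normal"

# A process showing a signature hit plus elevated danger signals:
verdict = assess([("pamp", 1.5), ("danger", 2.0), ("safe", 0.5)])
```

The key departure from self-nonself discrimination is visible even here: the verdict is driven by contextual evidence of damage, not by membership in a catalogue of "self" patterns.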

Relevance:

100.00%

Publisher:

Abstract:

The wide use of antibiotics in aquaculture has led to the emergence of resistant microbial species. This should be avoided or minimized by controlling the amount of drug employed in fish farming. For this purpose, the present work proposes test-strip papers for the detection and semi-quantitative determination of organic drugs by visual comparison of color changes, in an analytical procedure similar to that of pH monitoring with universal pH paper. This is done by establishing suitable chemical modifications of the cellulose, giving the paper the ability to react with the organic drug and produce a color change. Quantitative data are also enabled by taking a picture and applying a suitable mathematical treatment to the color coordinates given by the HSL system used by Windows. As proof of concept, this approach was applied to oxytetracycline (OXY), one of the antibiotics frequently used in aquaculture. A bottom-up modification of the paper was established, starting with the reaction of the glucose moieties on the paper with 3-triethoxysilylpropylamine (APTES). The so-formed amine layer allowed binding of a metal ion by coordination chemistry, while the metal ion in turn reacted with the drug to produce a colored compound. The metals most suitable for this modification were selected in bulk studies, and the several stages of the paper modification were optimized to produce an intense color change as a function of drug concentration. The paper strips were applied to the analysis of spiked environmental water, allowing quantitative determination of OXY concentrations as low as 30 ng/mL. Overall, this work provides a simple method to screen and discriminate tetracycline drugs in aquaculture, and is a promising tool for local, quick, and cheap monitoring of drugs.
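
The color-coordinate readout can be sketched in a few lines: convert the photographed spot's RGB values to a hue angle (Python's standard library exposes the closely related HLS model) and read the concentration off a calibration curve. The calibration pairs below are invented; the real assay would be calibrated against standards of known OXY concentration, and the paper's actual hue-concentration relation is not given here.

```python
import colorsys

def hue_deg(r, g, b):
    """Hue in degrees from 8-bit RGB, via colorsys's HLS conversion."""
    h, _l, _s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return h * 360

def concentration(hue, calibration):
    """Piecewise-linear interpolation on (hue_deg, ng/mL) pairs."""
    calibration = sorted(calibration)
    for (h0, c0), (h1, c1) in zip(calibration, calibration[1:]):
        if h0 <= hue <= h1:
            return c0 + (c1 - c0) * (hue - h0) / (h1 - h0)
    return None  # outside the calibrated range

# Invented calibration: hue shifts from 40 deg (blank) to 100 deg (300 ng/mL).
cal = [(40.0, 0.0), (60.0, 30.0), (80.0, 120.0), (100.0, 300.0)]
c = concentration(hue_deg(120, 160, 60), cal)
```

This mirrors the two-tier use described above: coarse visual comparison against a printed color scale, with a photographed strip providing the quantitative estimate when needed.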