973 results for Prototype Verification System


Relevance: 30.00%

Abstract:

Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients.

The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving “abdominal pain” serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims at covering theoretical as well as practical aspects of the system developed.

The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
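
Since the framework rests on the AHP/ANP, the core computation is deriving priority weights from a pairwise comparison matrix. The sketch below shows Saaty's standard eigenvector method and consistency ratio in Python; the matrix values are hypothetical, and the full ANP supermatrix with dependence and feedback is not modeled here.

```python
# Minimal AHP priority-weight sketch (NumPy only). The ANP extends this by
# assembling such local priority vectors into a supermatrix with feedback;
# the matrix below is an invented example, not data from the study.
import numpy as np

def ahp_priorities(pairwise: np.ndarray) -> np.ndarray:
    """Principal right eigenvector of a pairwise comparison matrix,
    normalized to sum to 1 (Saaty's eigenvector method)."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)            # dominant eigenvalue
    w = np.abs(eigvecs[:, k].real)
    return w / w.sum()

def consistency_ratio(pairwise: np.ndarray) -> float:
    """CR = CI / RI; judgments are usually accepted when CR < 0.1."""
    n = pairwise.shape[0]
    lam = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random indices
    return ci / ri

# Hypothetical pairwise judgments over three candidate diagnoses.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_priorities(A))        # roughly [0.65, 0.23, 0.12]
print(consistency_ratio(A))     # well below 0.1 for this matrix
```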

Relevance: 30.00%

Abstract:

Computer vision-based food recognition could be used to estimate a meal's carbohydrate content for diabetic patients. This study proposes a methodology for automatic food recognition based on the Bag of Features (BoF) model. An extensive technical investigation was conducted to identify and optimize the best-performing components of the BoF architecture and to estimate the corresponding parameters. For the design and evaluation of the prototype system, a visual dataset with nearly 5,000 food images was created and organized into 11 classes. The optimized system computes dense local features using the scale-invariant feature transform (SIFT) on the HSV color space, builds a visual dictionary of 10,000 visual words using hierarchical k-means clustering, and finally classifies the food images with a linear support vector machine classifier. The system achieved a classification accuracy on the order of 78%, demonstrating the feasibility of the proposed approach on a very challenging image dataset.
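
A minimal sketch of this pipeline, assuming OpenCV (>= 4.4, which ships SIFT) and scikit-learn. Computing SIFT per HSV channel is one plausible reading of "SIFT on the HSV color space", flat MiniBatchKMeans stands in for the paper's hierarchical k-means, and the grid step and dataset variables are placeholders.

```python
# Sketch of a dense-SIFT Bag-of-Features classifier, not the study's code.
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

sift = cv2.SIFT_create()

def dense_sift_hsv(bgr_image, step=8):
    """Dense SIFT descriptors computed on each HSV channel and concatenated."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    kps = [cv2.KeyPoint(float(x), float(y), float(step))
           for y in range(step, h - step, step)
           for x in range(step, w - step, step)]
    descs = []
    for c in range(3):
        channel = np.ascontiguousarray(hsv[:, :, c])
        _, d = sift.compute(channel, kps)
        descs.append(d)
    return np.hstack(descs)                # one row per grid point

def bof_histogram(descriptors, kmeans):
    """Quantize descriptors against the visual vocabulary."""
    words = kmeans.predict(descriptors)
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
    return hist / hist.sum()

# Training outline (train_images / train_labels assumed available):
# all_desc = np.vstack([dense_sift_hsv(im) for im in train_images])
# kmeans   = MiniBatchKMeans(n_clusters=10_000).fit(all_desc)
# X        = np.array([bof_histogram(dense_sift_hsv(im), kmeans)
#                      for im in train_images])
# clf      = LinearSVC().fit(X, train_labels)
```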

Relevance: 30.00%

Abstract:

A number of liquid argon time projection chambers (LAr TPCs) are being built or are proposed for neutrino experiments on long- and short-baseline beams. For these detectors, a distortion of the drift field due to geometrical or physics reasons can affect the reconstruction of events. Depending on the TPC geometry and the electric drift field intensity, this distortion can be of the same magnitude as the drift field itself. Recently, we presented a method to calibrate the drift field and correct for these possible distortions. While straight cosmic ray muon tracks could be used for calibration, multiple Coulomb scattering and momentum uncertainties allow only a limited resolution. A UV laser, instead, can create straight ionization tracks in liquid argon and allows one to map the drift field along different paths in the TPC inner volume. Here we present a UV laser feed-through design with a steerable UV mirror immersed in liquid argon that can point the laser beam at many locations through the TPC. The straight ionization paths are sensitive to drift field distortions; a fit of these distortions to the straight optical path allows one to extract the drift field, and by using such laser tracks throughout the whole TPC volume one can obtain a 3D drift field map. The UV laser feed-through assembly is a prototype of the system that will be used for the MicroBooNE experiment at the Fermi National Accelerator Laboratory (FNAL).
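
The fitting step is conceptually simple: the laser track is known to be straight, so the residuals of the reconstructed hits with respect to the best-fit line measure the local distortion along that path. A toy illustration with synthetic numbers (not MicroBooNE geometry):

```python
# Toy version of the calibration idea: deviations of reconstructed hit
# positions from a straight-line fit map the drift-field distortion.
import numpy as np

def track_distortion(z, x_reco):
    """Fit a straight line x(z) to the reconstructed track and return the
    residuals, i.e. the apparent displacement along this path."""
    slope, intercept = np.polyfit(z, x_reco, deg=1)
    return x_reco - (slope * z + intercept)

# Synthetic track: straight line plus a localized distortion-induced shift.
z = np.linspace(0.0, 2.0, 200)                     # drift coordinate [m]
x_true = 0.1 * z + 0.5                             # true straight track
bump = 0.003 * np.exp(-((z - 1.2) / 0.2) ** 2)     # local field distortion
displacement = track_distortion(z, x_true + bump)
print(displacement.max())
# Repeating this for many mirror positions and beam directions tiles the
# TPC volume, yielding the 3D displacement map to invert into a field map.
```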

Relevance: 30.00%

Abstract:

Background: Diabetes mellitus is spreading throughout the world, and diabetic individuals have been shown to often assess their food intake inaccurately; therefore, it is a matter of urgency to develop automated diet assessment tools. The recent availability of mobile phones with enhanced capabilities, together with advances in computer vision, has permitted the development of image analysis apps for the automated assessment of meals. GoCARB is a mobile phone-based system designed to support individuals with type 1 diabetes during daily carbohydrate estimation. In a typical scenario, the user places a reference card next to the dish and acquires two images using a mobile phone. A series of computer vision modules detect the plate and automatically segment and recognize the different food items, while their 3D shape is reconstructed. Finally, the carbohydrate content is calculated by combining the volume of each food item with the nutritional information provided by the USDA Nutrient Database for Standard Reference.

Objective: The main objective of this study is to assess the accuracy of the GoCARB prototype when used by individuals with type 1 diabetes and to compare it to their own performance in carbohydrate counting. In addition, the user experience and usability of the system are evaluated by questionnaires.

Methods: The study was conducted at the Bern University Hospital, “Inselspital” (Bern, Switzerland) and involved 19 adult volunteers with type 1 diabetes, each participating once. On each study day, a total of six meals of broad diversity were taken from the hospital's restaurant and presented to the participants. The food items were weighed on a standard balance and the true amount of carbohydrate was calculated from the USDA nutrient database. Participants were asked to count the carbohydrate content of each meal independently and then by using GoCARB. At the end of each session, a questionnaire was completed to assess the user's experience with GoCARB.

Results: The mean absolute error of the participants' estimates was 27.89 (SD 38.20) grams of carbohydrate, whereas the corresponding value for the GoCARB system was 12.28 (SD 9.56) grams, a significantly better performance (P=.001). In 75.4% (86/114) of the meals the GoCARB automatic segmentation was successful, and 85.1% (291/342) of individual food items were successfully recognized. Most participants found GoCARB easy to use.

Conclusions: This study indicates that the system is able to estimate, on average, the carbohydrate content of meals with higher accuracy than individuals with type 1 diabetes can. The participants thought the app was useful and easy to use. GoCARB seems to be a well-accepted supportive mHealth tool for the assessment of served-on-a-plate meals.
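
The final GoCARB step, volume combined with nutritional information, is plain arithmetic. A hedged sketch in which every density and carbohydrate fraction is an illustrative placeholder rather than a USDA value:

```python
# Back-of-the-envelope version of the carbohydrate-calculation step:
# carbohydrate mass = reconstructed volume x density x carb fraction.
from dataclasses import dataclass

@dataclass
class FoodItem:
    name: str
    volume_ml: float          # from the 3D reconstruction
    density_g_per_ml: float   # per-class lookup (assumed value)
    carb_fraction: float      # g carbohydrate per g of food (assumed value)

def carbs_grams(item: FoodItem) -> float:
    return item.volume_ml * item.density_g_per_ml * item.carb_fraction

meal = [
    FoodItem("rice (hypothetical)", 180.0, 0.80, 0.28),
    FoodItem("peas (hypothetical)",  90.0, 0.65, 0.14),
]
total = sum(carbs_grams(item) for item in meal)
print(f"estimated meal carbohydrate: {total:.1f} g")
```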

Relevance: 30.00%

Abstract:

The study of operations on representations of objects is well documented in the realm of spatial engineering. However, the mathematical structure and formal proof of these operational phenomena are not thoroughly explored. Other works have often focused on query-based models that seek to order classes and instances of objects in the form of semantic hierarchies or graphs. In some models, nodes of graphs represent objects and are connected by edges that represent different types of coarsening operators. This work, however, studies how the coarsening operator "simplification" can manipulate partitions of finite sets, independently of objects and their attributes. Partitions that are "simplified" first have a collection of elements filtered (removed), and then the remaining partition is amalgamated (some sub-collections are unified). Simplification has many interesting mathematical properties. A finite composition of simplifications can also be accomplished by some single simplification. Also, if one partition is a simplification of the other, the simplified partition is defined to be less than the other partition according to the simp relation. This relation is shown to be a partial-order relation based on simplification. Collections of partitions can not only be proven to have a partial-order structure, but also to have a lattice structure and to be complete. In regard to a geographic information system (GIS), partitions related to subsets of attribute domains for objects are called views. Objects belong to different views based on whether or not their attribute values lie in the underlying view domain. Given a particular view, objects whose attribute n-tuple codings are contained in the view are part of the actualization set on views, and objects are labeled according to the particular subset of the view in which their coding lies. Though the scope of the work does not mainly focus on queries related directly to geographic objects, it provides verification for the existence of particular views in a system with this underlying structure. Given a finite attribute domain, one can say with mathematical certainty that different views of objects are partially ordered by simplification, and every collection of views has a greatest lower bound and least upper bound, which provides the validity for exploring queries in this regard.
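
An executable reading of the simp relation, under the assumption (consistent with the description above) that Q simplifies P exactly when Q partitions a subset of P's ground set (filtering) and no surviving block of P is split across blocks of Q (amalgamation):

```python
# Partitions modelled as sets of frozensets; this is an interpretation of
# the "filter then amalgamate" definition, not the thesis's formalism.

def is_simplification(P, Q):
    """True iff Q can be obtained from P by filtering then amalgamating."""
    ground_p = set().union(*P)
    ground_q = set().union(*Q)
    if not ground_q <= ground_p:
        return False                      # Q may only drop elements
    for block in P:
        survivors = block & ground_q
        if survivors and not any(survivors <= b for b in Q):
            return False                  # a surviving block was split
    return True

P = {frozenset({1, 2}), frozenset({3}), frozenset({4, 5})}
Q = {frozenset({1, 2, 3})}                # filtered {4, 5}, merged the rest
print(is_simplification(P, Q))            # True
print(is_simplification(Q, P))            # False, as a partial order demands
```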

Relevance: 30.00%

Abstract:

The purpose of this research and development project was to develop a method, a design, and a prototype for gathering, managing, and presenting data about occupational injuries.

State-of-the-art systems analysis and design methodologies were applied to the long-standing problem in the field of occupational safety and health of processing workplace injury data into information for safety and health program management, as well as for preliminary research about accident etiologies. The top-down planning and bottom-up implementation approach was utilized to design an occupational injury management information system. A description of a managerial control system and a comprehensive system to integrate safety and health program management was provided.

The project showed that current management information systems (MIS) theory and methods could be applied successfully to the problems of employee injury surveillance and control program performance evaluation. The model developed in the first section was applied at The University of Texas Health Science Center at Houston (UTHSCH).

The system in current use at the UTHSCH was described and evaluated, and a prototype was developed for the UTHSCH. The prototype incorporated procedures for collecting, storing, and retrieving records of injuries, along with the procedures necessary to prepare reports, analyses, and graphics for management in the Health Science Center. Examples of reports, analyses, and graphics presenting UTHSCH and computer-generated data were included.

It was concluded that a pilot test of this MIS should be implemented and evaluated at the UTHSCH and other settings. Further research and development efforts on the total safety and health management information systems, control systems, component systems, and variable selection should be pursued. Finally, integration of the safety and health program MIS into the comprehensive or executive MIS was recommended.

Relevance: 30.00%

Abstract:

Early Employee Assistance Programs (EAPs) had their origin in humanitarian motives, and there was little concern for their cost/benefit ratios; however, as some programs began accumulating data and analyzing them over time, even with single variables such as absenteeism, it became apparent that the humanitarian reasons for a program could be reinforced by cost savings, particularly when the existence of the program was subject to justification.

Today there is general agreement that cost/benefit analyses of EAPs are desirable, but the specific models for such analyses, particularly those making use of sophisticated but simple computer-based data management systems, are few.

The purpose of this research and development project was to develop a method, a design, and a prototype for gathering, managing, and presenting information about EAPs. This scheme provides information retrieval and analyses relevant to such aspects of EAP operations as: (1) EAP personnel activities, (2) supervisory training effectiveness, (3) client population demographics, (4) assessment and referral effectiveness, (5) treatment network efficacy, and (6) the economic worth of the EAP.

This scheme has been implemented and made operational at The University of Texas Employee Assistance Programs for more than three years.

Application of the scheme in the various programs has defined certain variables that remain necessary in all programs. Depending on the degree of aggressiveness for data acquisition maintained by program personnel, other program-specific variables are also defined.

Relevance: 30.00%

Abstract:

The purpose of this work was to develop a comprehensive IMSRT QA procedure that examined, using EPID dosimetry and Monte Carlo (MC) calculations, each step in the treatment planning and delivery process. These steps included verification of the field shaping, of the treatment planning system (RTPS) dose calculations, and of the patient dose delivery. Verification of each step in the treatment process is assumed to result in correct dose delivery to the patient.

The accelerator MC model was verified against commissioning data for field sizes from 0.8 × 0.8 cm² to 10 × 10 cm². Depth doses were within 2% local percent difference (LPD) in low-gradient regions and 1 mm distance to agreement (DTA) in high-gradient regions. Lateral profiles were within 2% LPD in low-gradient regions and 1 mm DTA in high-gradient regions. Calculated output factors were within 1% of measurement for field sizes ≥1 × 1 cm².

The measured and calculated pretreatment EPID dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central axis pixel value, with ≥95% of compared points required to pass for successful verification. Pretreatment field verification resulted in 97% of the points passing.

The RTPS and Monte Carlo phantom dose calculations were compared using 5% LPD, 2 mm DTA, or 2% of the maximum dose, with ≥95% of compared points required to pass for successful verification. RTPS calculation verification resulted in 97% of the points passing.

The measured and calculated EPID exit dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central axis pixel value, with ≥95% of compared points required to pass for successful verification. Exit dose verification resulted in 97% of the points passing.

Each of the processes above verified an individual step in the treatment planning and delivery process. The combination of these verification steps ensures accurate treatment delivery to the patient. This work shows that Monte Carlo calculations and EPID dosimetry can be used to quantitatively verify IMSRT treatments, resulting in improved patient care and, potentially, improved clinical outcomes.
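
The composite per-point test (pass on LPD, or on DTA, or on a percentage of the central-axis value) can be made concrete with a small sketch. The 1D profiles, tolerances, and the value-bracketing shortcut used for DTA below are illustrative simplifications, not the thesis's implementation:

```python
# Composite LPD / DTA / global-criterion pass test on a toy 1D dose profile.
import numpy as np

def point_pass(measured, calculated, x_mm,
               lpd=0.05, dta_mm=1.0, global_frac=0.02):
    """Boolean mask: a point passes if any one criterion is met."""
    cax = calculated.max()                 # stand-in for the central-axis value
    diff = np.abs(measured - calculated)
    ok = diff <= lpd * np.abs(calculated)              # local percent difference
    ok |= diff <= global_frac * cax                    # 2% of CAX value
    for i, m in enumerate(measured):                   # crude DTA check:
        if ok[i]:
            continue
        near = np.abs(x_mm - x_mm[i]) <= dta_mm
        lo, hi = calculated[near].min(), calculated[near].max()
        ok[i] = lo <= m <= hi          # measured value bracketed within dta_mm
    return ok

x = np.linspace(-50, 50, 201)                          # position [mm]
calc = np.exp(-(x / 20.0) ** 2)                        # toy dose profile
noise = np.random.default_rng(0).standard_normal(x.size)
meas = calc * (1 + 0.01 * noise)
mask = point_pass(meas, calc, x)
print(f"{100 * mask.mean():.1f}% of points pass")      # expect >= 95%
```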

Relevance: 30.00%

Abstract:

Purpose: Traditional patient-specific IMRT QA measurements are labor intensive and consume machine time. Calculation-based IMRT QA methods typically are not comprehensive. We have developed a comprehensive calculation-based IMRT QA method to detect uncertainties introduced by the initial dose calculation, the data transfer through the record-and-verify (R&V) system, and various aspects of the physical delivery.

Methods: We recomputed the treatment plans in the patient geometry for 48 cases, using data from the R&V system and from the delivery unit to calculate the “as-transferred” and “as-delivered” doses, respectively. These data were sent to the original TPS to verify transfer and delivery, or to a second TPS to verify the original calculation. For each dataset we examined the dose computed from the R&V record (RV) and from the delivery records (Tx), and the dose computed with a second verification TPS (vTPS). Each verification dose was compared to the clinical dose distribution using 3D gamma analysis and by comparison of mean dose and ROI-specific dose levels to target volumes. Plans were also compared to IMRT QA absolute and relative dose measurements.

Results: The average 3D gamma passing percentages using 3%-3mm, 2%-2mm, and 1%-1mm criteria for the RV plan were 100.0 (σ=0.0), 100.0 (σ=0.0), and 100.0 (σ=0.1); for the Tx plan they were 100.0 (σ=0.0), 100.0 (σ=0.0), and 99.0 (σ=1.4); and for the vTPS plan they were 99.3 (σ=0.6), 97.2 (σ=1.5), and 79.0 (σ=8.6). When comparing target volume doses in the RV, Tx, and vTPS plans to the clinical plans, the average ratios of ROI mean doses were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.990 (σ=0.009), and of ROI-specific dose levels 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.980 (σ=0.043), respectively. Comparing the clinical, RV, Tx, and vTPS calculated doses to the IMRT QA measurements for all 48 patients, the average ratios for absolute doses were 0.999 (σ=0.013), 0.998 (σ=0.013), 0.999 (σ=0.015), and 0.990 (σ=0.012), respectively, and the average 2D gamma (5%-3mm) passing percentages for relative doses for 9 patients were 99.36 (σ=0.68), 99.50 (σ=0.49), 99.13 (σ=0.84), and 98.76 (σ=1.66), respectively.

Conclusions: Together with mechanical and dosimetric QA, our calculation-based IMRT QA method promises to minimize the need for patient-specific QA measurements by identifying outliers in need of further review.
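
As a concrete illustration of one metric above, the ROI mean-dose ratio between a verification dose grid and the clinical one reduces to a few lines. The dose arrays and the PTV mask here are synthetic placeholders:

```python
# Ratio of ROI mean doses between a verification and a clinical dose grid;
# values near 1.000 indicate agreement, mirroring the ratios reported above.
import numpy as np

def roi_mean_dose_ratio(dose_verif, dose_clin, roi_mask):
    """Ratio of mean doses inside an ROI."""
    return dose_verif[roi_mask].mean() / dose_clin[roi_mask].mean()

rng = np.random.default_rng(1)
clinical = rng.uniform(1.8, 2.2, size=(64, 64, 32))        # toy dose grid [Gy]
verification = clinical * rng.normal(1.000, 0.002, clinical.shape)
target = np.zeros_like(clinical, dtype=bool)
target[20:40, 20:40, 10:20] = True                         # toy PTV mask
ratio = roi_mean_dose_ratio(verification, clinical, target)
print(f"mean-dose ratio: {ratio:.3f}")
```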

Relevance: 30.00%

Abstract:

In preparation for the Russian Luna-Resurs mission, we combined our compact time-of-flight mass spectrometer (TOF-MS) with chemical pre-separation of the species by gas chromatography (GC). Coupled measurements with both instruments were successfully performed with the prototype of the mass spectrometer and a flight-like gas chromatograph. The system was tested with two test gas mixtures, a mixture of hydrocarbons and a mixture of noble gases. Owing to its capability to record mass spectra over the full mass range at once, with high sensitivity and a dynamic range of up to 10⁶ within 1 s, the TOF-MS system is a valuable extension of the GC analytical system. Based on the measurements with calibration gases performed with the combined GC-MS prototype, and assuming mean characteristics for the Moon's regolith, the detection limit for volatile species in a soil sample is estimated to be 2×10⁻¹⁰ by mass for hydrocarbons and 2×10⁻⁹ by mass for noble gases.

Relevance: 30.00%

Abstract:

A novel HCPV nonimaging concentrator concept with high concentration (>500×) is presented. It combines a commercial concentration GaInP/GaInAs/Ge 3J cell with a concentration back-point-contact (BPC) silicon cell for efficient spectral utilization, and uses external confinement techniques to recover the 3J cell's reflection. The primary optical element (POE) is a flat Fresnel lens and the secondary optical element (SOE) is a free-form RXI-type concentrator with a band-pass filter embedded in it; both POE and SOE perform Köhler integration to produce light homogenization. The band-pass filter sends the IR photons in the 900–1200 nm band to the silicon cell. Computer simulations predict that four-terminal designs could achieve ∼46% added cell efficiencies using commercial 39% 3J and 26% Si cells. A first proof-of-concept receiver prototype has been manufactured using a simpler optical architecture (with a lower concentration of ∼100× and a lower simulated added efficiency), and experimental measurements have shown up to 39.8% 4J receiver efficiency using a 3J cell with a peak efficiency of 36.9%.
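
The "added efficiency" bookkeeping for such a spectrum-splitting receiver is weighted arithmetic over the split bands. In the sketch below, the band's power share and the per-band conversion efficiencies are invented placeholders chosen only to show the accounting, not the study's optical model:

```python
# Rough spectrum-splitting arithmetic: each cell converts the share of the
# incident spectrum routed to it by the band-pass filter. All numbers are
# illustrative assumptions.
in_band_900_1200 = 0.15   # assumed fraction of incident power in 900-1200 nm
eta_si_in_band = 0.30     # assumed Si conversion efficiency within that band
eta_3j_rest = 0.45        # assumed 3J efficiency on the remaining spectrum

added_efficiency = ((1 - in_band_900_1200) * eta_3j_rest
                    + in_band_900_1200 * eta_si_in_band)
print(f"combined 4-terminal efficiency: {added_efficiency:.1%}")   # ~42.8% here
```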

Relevance: 30.00%

Abstract:

In 2008, the City Council of Rivas-Vaciamadrid (Spain) decided to promote the construction of “Rivasecopolis”, a complex of sustainable buildings in which a new prototype of a zero-energy house would become the office of the Energy Agency. Following the City Council's initiative, it was decided to recreate the dwelling prototype “Magic-box”, which entered the 2005 Solar Decathlon competition. The original project has been adapted to a new programme of requirements by adding the spaces necessary for it to work as an office. A team from the university designed the building and directed the construction works. The new solar house is conceived as a “testing building”. It will become the space for assisting citizens with all questions about energy saving, energy efficiency, and sustainable construction, with a small permanent exhibition space in addition to the working places serving this information purpose. At the same time, the building incorporates experimental passive architecture systems and a monitoring and control system. Collected data will be sent to the university to support research work on the experimental strategies included in the building. This paper describes and analyzes the experience of transforming a prototype into a real, durable building and the benefits for both the university and citizens of learning about sustainability through the building.

Relevance: 30.00%

Abstract:

This work is based on the prototype High Temperature Engineering Test Reactor (HTTR) of the Japan Atomic Energy Agency (JAEA). Its objective is to describe an adequate deterministic model to be used in assessing the reactor's design safety margins via damage domains. The concept of a damage domain is defined, and its relevance is shown in the ongoing effort to apply dynamic risk assessment methods and tools based on the Theory of Stimulated Dynamics (TSD). To illustrate, we present results of an abnormal control rod (CR) withdrawal during subcritical conditions and compare them with results obtained by JAEA. No attempt is made yet to assess the detailed scenarios; rather, the aim is to show how the approach may handle events of this kind.

Relevance: 30.00%

Abstract:

Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic, and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results to guide performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers with certain program elements and can measure different types of resource-related properties that affect performance, preserving the precedence of cost centers in the call graph. It includes an automatic method for detecting procedures that are performance bottlenecks. The profiling tool has been integrated in a previously developed run-time checking framework to allow verification of certain properties when they cannot be verified statically. The approach allows checking global computational properties which require complex instrumentation to track information about previous execution states, such as that the execution time accumulated by a given procedure is not greater than a given bound. We have built a prototype implementation, integrated it in the Ciao/CiaoPP system, and successfully applied it to performance improvement, automatic optimization (e.g., resource-aware specialization of programs), run-time checking, and debugging of global computational properties (e.g., resource usage) in Prolog programs.
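
The tool itself targets Prolog (Ciao/CiaoPP); purely as a language-neutral illustration of two of its core ideas — attributing accumulated resource usage to named cost centers and checking a global bound at run time — here is a minimal Python sketch:

```python
# Cost-center accounting plus a run-time check of a global computational
# property ("accumulated time of this procedure stays below a bound").
import time
from collections import defaultdict
from functools import wraps

accumulated = defaultdict(float)      # cost center -> total seconds

def cost_center(name, time_bound_s=None):
    """Attribute a callable's execution time to a named cost center and
    optionally enforce an accumulated-time bound at run time."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            t0 = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                accumulated[name] += time.perf_counter() - t0
                if time_bound_s is not None and accumulated[name] > time_bound_s:
                    raise RuntimeError(
                        f"cost center {name!r} exceeded {time_bound_s}s")
        return wrapper
    return deco

@cost_center("solver", time_bound_s=0.5)
def solve(n):
    return sum(i * i for i in range(n))

solve(100_000)
print(dict(accumulated))              # e.g. {'solver': 0.006...}
```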

Relevance: 30.00%

Abstract:

We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling run-time checks for (parts of) assertions that cannot be verified at compile time, via program transformation. This transformation allows checking preconditions and postconditions, including conditional postconditions, properties at arbitrary program points, and certain computational properties. The implemented transformation includes several optimizations to reduce run-time overhead. We also propose a minimal addition to the assertion language which allows defining unit tests to be run in order to detect possible violations of the (partial) specifications expressed by the assertions. This language can express, for example, the input data for performing the unit tests or the number of times that the unit tests should be repeated. We have implemented the framework within the Ciao/CiaoPP system and effectively applied it to the verification of ISO-Prolog compliance and to the detection of different types of bugs in the Ciao system source code. Several experimental results are presented that illustrate different trade-offs among program size, running time, and levels of verbosity of the messages shown to the user.
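
Again as an analogue rather than the Ciao assertion language itself: the sketch below shows, in Python, how precondition/postcondition assertions that cannot be discharged statically can be compiled into run-time checks wrapping a procedure. All names and predicates here are invented for illustration:

```python
# Pre/postconditions turned into run-time checks via a wrapping transform.
from functools import wraps

def check(pre=None, post=None):
    """Turn precondition/postcondition predicates into run-time checks;
    `post` receives the result followed by the original arguments."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), \
                    f"precondition of {fn.__name__} failed"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result, *args, **kwargs), \
                    f"postcondition of {fn.__name__} failed"
            return result
        return wrapper
    return deco

@check(pre=lambda xs: all(x >= 0 for x in xs),
       post=lambda r, xs: r >= 0)
def total(xs):
    return sum(xs)

print(total([1, 2, 3]))       # passes both checks
# total([-1]) would raise: precondition of total failed
```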