903 results for game design techniques
Abstract:
It is hard to quantify how difficult it is for most people to deal with game addiction. Several articles have been published on this subject, some proposing educational and serious games. Researchers have likewise shown that learning a subject like mathematics does not come easily. This is where the immersive world of computer games comes in. It is remarkable how much people learn from games. In this paper, we design and program a simple PC math game that teaches rudimentary topics in mathematics.
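The abstract does not describe the game's mechanics, so the sketch below is only a minimal illustration of the kind of drill-style math game loop such a paper might involve; the question types, scoring, and round length are assumptions, not the authors' implementation.

```python
import random

def make_question():
    """Generate a random addition or multiplication drill question."""
    a, b = random.randint(1, 12), random.randint(1, 12)
    op = random.choice(["+", "*"])
    answer = a + b if op == "+" else a * b
    return f"{a} {op} {b} = ?", answer

def play(rounds=5):
    """Run a short quiz round and report the score."""
    score = 0
    for _ in range(rounds):
        prompt, answer = make_question()
        try:
            reply = int(input(prompt + " "))
        except ValueError:
            reply = None
        if reply == answer:
            score += 1
            print("Correct!")
        else:
            print(f"Not quite; the answer is {answer}.")
    print(f"Score: {score}/{rounds}")

if __name__ == "__main__":
    play()
```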
Abstract:
Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Science, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse. Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is automatic technical report generation, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge-sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. These methods help reduce the large-scale inefficiencies that exist in current product design and development cycles due to poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
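The thesis itself drives this logic from an ontology-based knowledge base inside FiPER/iSIGHT-FD; the sketch below only illustrates the underlying idea of shifting between a cheap beam-element model and an expensive shell-element model during an optimization, using made-up surrogate cost functions and a simple accuracy-margin switching rule as stand-ins.

```python
# Illustrative sketch (not the thesis implementation): switch between a cheap,
# lower-fidelity "beam" model and an expensive, high-fidelity "shell" model
# while searching for an I-beam flange width that meets a stress limit.

def beam_model(width_mm):
    """Stand-in for a beam-element FE analysis: fast, approximate stress [MPa]."""
    return 500.0 / width_mm * 1.08   # assume ~8% error vs. the shell model

def shell_model(width_mm):
    """Stand-in for a shell-element FE analysis: slow, accurate stress [MPa]."""
    return 500.0 / width_mm

def optimize(stress_limit=25.0, width=10.0, step=1.0, switch_margin=0.10):
    """Grow the flange until the stress limit is met, using the cheap model
    far from the constraint and the accurate model close to it."""
    evaluations = {"beam": 0, "shell": 0}
    while True:
        s = beam_model(width)
        evaluations["beam"] += 1
        # Near the constraint boundary, re-check with the high-fidelity model.
        if s <= stress_limit * (1 + switch_margin):
            s = shell_model(width)
            evaluations["shell"] += 1
            if s <= stress_limit:
                return width, s, evaluations
        width += step

if __name__ == "__main__":
    w, s, evals = optimize()
    print(f"width={w:.1f} mm, stress={s:.1f} MPa, model calls={evals}")
```

The design intent mirrors the abstract: most evaluations hit the inexpensive model, and the expensive model is consulted only where its accuracy actually matters for the decision.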
Abstract:
Continuous advancements in technology have led to increasingly comprehensive and distributed product development processes in pursuit of improved products at reduced costs. Information associated with these products is ever changing, and structured frameworks have become integral to managing such fluid information. Ontologies and the Semantic Web have emerged as key alternatives for capturing product knowledge in both a human-readable and computable manner. The primary focus of this research is to characterize relationships formed within methodically developed distributed design knowledge frameworks, ultimately to provide pervasive real-time awareness in distributed design processes. Utilizing formal logics in the form of the Semantic Web's OWL and SWRL, causal relationships are expressed to guide and facilitate knowledge acquisition as well as to identify contradictions within a knowledge base. To improve efficiency during both the development and operational phases of these "intelligent" frameworks, a semantic relatedness algorithm is designed specifically to identify and rank underlying relationships within product development processes. After reviewing several semantic relatedness measures, three techniques, including a novel meronomic technique, are combined to create AIERO, the Algorithm for Identifying Engineering Relationships in Ontologies. To determine its applicability and accuracy, AIERO was applied to three separate, independently developed ontologies. The results indicate that AIERO is capable of consistently returning relatedness values one would intuitively expect. To assess the effectiveness of AIERO in exposing underlying causal relationships across product development platforms, a case study involving the development of an industry-inspired printed circuit board (PCB) is presented. After instantiating the PCB knowledge base and developing an initial set of rules, FIDOE, the Framework for Intelligent Distributed Ontologies in Engineering, was employed to identify additional causal relationships through extensional relatedness measurements. In a concluding PCB redesign, the resulting "intelligent" framework demonstrates its ability to pass values between instances, identify inconsistencies among instantiated knowledge, and identify conflicting values within product development frameworks. The results highlight how the introduced semantic methods can enhance the knowledge acquisition, knowledge management, and knowledge validation capabilities of traditional knowledge bases.
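AIERO's actual component measures are not specified in the abstract; as a loose illustration of combining several relatedness techniques (including a part-of, i.e., meronomic, signal) into one score, the sketch below mixes three toy measures over a tiny hand-built taxonomy and meronomy. The concept names, weights, and scoring functions are all assumptions for illustration.

```python
# Illustrative only: combines a taxonomic path score, a shared-ancestor overlap,
# and a part-of (meronomic) score into one weighted relatedness value.

SUBCLASS_OF = {        # child -> parent (toy taxonomy)
    "Resistor": "Component", "Capacitor": "Component",
    "Component": "Artifact", "Board": "Artifact",
}
PART_OF = {            # whole -> set of parts (toy meronomy)
    "Board": {"Resistor", "Capacitor"},
}

def ancestors(c):
    out = set()
    while c in SUBCLASS_OF:
        c = SUBCLASS_OF[c]
        out.add(c)
    return out

def path_score(a, b):
    """1 / (1 + taxonomic distance outside the shared ancestry)."""
    if a == b:
        return 1.0
    shared = ancestors(a) & ancestors(b)
    if not shared:
        return 0.0
    return 1.0 / (1 + len(ancestors(a) - shared) + len(ancestors(b) - shared))

def ancestor_overlap(a, b):
    """Jaccard overlap of the two ancestor sets."""
    A, B = ancestors(a), ancestors(b)
    return len(A & B) / len(A | B) if A | B else 0.0

def meronomic_score(a, b):
    """1.0 if one concept is directly a part of the other (toy rule)."""
    return 1.0 if b in PART_OF.get(a, set()) or a in PART_OF.get(b, set()) else 0.0

def relatedness(a, b, weights=(0.4, 0.3, 0.3)):
    w1, w2, w3 = weights
    return w1 * path_score(a, b) + w2 * ancestor_overlap(a, b) + w3 * meronomic_score(a, b)

print(relatedness("Resistor", "Capacitor"))  # siblings under Component
print(relatedness("Resistor", "Board"))      # part-of relation contributes
```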
Abstract:
Introduction: Since the introduction and evolution of laparoscopic surgery, there have been concerns about surgical training in this field. Laparoscopic box trainers and virtual simulators are useful devices that have demonstrated effectiveness in learning surgical skills. However, these tools remain inaccessible for many centers around the world. Our intent is to share our experience with a successful design, to inspire others in surgical residency programs to build such boxes for training in laparoscopic techniques, and to encourage the use of simulators in educational centers. [See PDF for complete abstract]
Abstract:
Introduction: The United States is currently experiencing an increased prevalence of obesity. This is a particular problem among children, who require dietary and activity behavior change to mitigate it. The use of computer games as channels to motivate health behavior in children is increasing. Casual games are a subset of computer games that are simple in design, easy to access and play, popular with children, and potentially effective for drill-and-practice learning. [See PDF for complete abstract]
Abstract:
In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Applying traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates. However, such estimators are often inefficient compared to methods that incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random-effects or random-components models, can be used to account for the clustering of data by estimating higher-level (group) as well as lower-level (individual) variation. Designing a study in which the unit of observation is nested within higher-level groupings requires determining sample sizes at each level. This study investigates the design and analysis of various sampling strategies for a 3-level repeated measures design and their effects on parameter estimates when the outcome variable of interest follows a Poisson distribution. Results of the study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level-2 and level-3 variation is less than 0.10. However, as the higher-level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher-level error variance is large, the estimates may be significantly biased; in this case, bias correction techniques such as bootstrapping should be considered as an alternative procedure. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level-1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large. Reference: Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data." Biometrics, 49, 989-996.
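For concreteness, the sketch below simulates the kind of 3-level repeated-measures Poisson structure studied here (clusters, subjects within clusters, repeated counts within subjects). The cluster counts, variance components, and fixed effects are hypothetical, and the sketch does not reproduce the PQL/MQL estimation compared in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical design: 20 level-3 clusters, 10 subjects per cluster,
# 4 repeated measurements per subject; log-link Poisson outcome.
n_clusters, n_subjects, n_times = 20, 10, 4
sigma_l3, sigma_l2 = 0.3, 0.3          # level-3 and level-2 SDs (illustrative)
beta0, beta_time = 0.5, 0.1            # fixed intercept and time slope

rows = []
for c in range(n_clusters):
    u3 = rng.normal(0, sigma_l3)                      # cluster random effect
    for s in range(n_subjects):
        u2 = rng.normal(0, sigma_l2)                  # subject random effect
        for t in range(n_times):
            mu = np.exp(beta0 + beta_time * t + u3 + u2)
            rows.append((c, s, t, rng.poisson(mu)))

data = np.array(rows)
print("total observations:", len(data))
print("mean count:", data[:, 3].mean())
```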
Abstract:
Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients. The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving "abdominal pain" serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims at covering theoretical as well as practical aspects of the system developed. The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
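The system itself is built on ANP with group consensus weights; as a smaller, concrete illustration of the underlying priority calculation shared by AHP and ANP, the sketch below derives priorities from a pairwise comparison matrix via the principal eigenvector and checks consistency. The matrix values and the three-alternative setup are made up for illustration.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three diagnoses given a symptom
# (Saaty's 1-9 scale; reciprocal by construction).
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio (random index RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58

print("priorities:", np.round(w, 3))
print("consistency ratio:", round(cr, 3))   # < 0.1 is conventionally acceptable
```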
Abstract:
PURPOSE Hodgkin lymphoma (HL) is a highly curable disease, so reducing late complications and second malignancies has become increasingly important. Radiotherapy target paradigms are currently changing and radiotherapy techniques are evolving rapidly. DESIGN This overview reports the extent to which target volume reduction in involved-node radiotherapy (INRT) and advanced radiotherapy techniques, such as intensity-modulated radiotherapy (IMRT) and proton therapy, can reduce high doses to organs at risk (OAR) compared with involved-field radiotherapy (IFRT) and 3D radiotherapy (3D-RT), and examines the issues that remain open. RESULTS Although no comparison of all available techniques on identical patient datasets exists, clear patterns emerge. Advanced dose-calculation algorithms (e.g., convolution-superposition/Monte Carlo) should be used in mediastinal HL. INRT consistently reduces treated volumes compared with IFRT, with the exact amount depending on the INRT definition. With INRT, the number of patients who might significantly benefit from highly conformal techniques such as IMRT over 3D-RT in terms of high-dose exposure to OAR is smaller. The impact of the larger volumes treated with low doses in advanced techniques is unclear. The type of IMRT used (static/rotational) is of minor importance. All advanced photon techniques offer similar potential benefits and disadvantages, so only the degree of modulation should be chosen based on individual treatment goals. Treatment in deep-inspiration breath hold is being evaluated. Protons theoretically provide both excellent high-dose conformality and reduced integral dose. CONCLUSION Further reduction of treated volumes most effectively reduces OAR dose, most likely without disadvantages if the excellent control rates currently achieved are maintained. For both IFRT and INRT, the benefits of advanced radiotherapy techniques depend on the individual patient and target geometry. Their use should therefore be decided case by case with comparative treatment planning.
Abstract:
OBJECTIVE A number of factors limit the effectiveness of current aortic arch studies in assessing optimal neuroprotection strategies, including insufficient patient numbers, heterogeneous definitions of clinical variables, multiple technical strategies, inadequate reporting of surgical outcomes and a lack of collaborative effort. We have formed an international coalition of centres to provide more robust investigations into this topic. METHODS High-volume aortic arch centres were identified from the literature and contacted for recruitment. A Research Steering Committee of expert arch surgeons was convened to oversee the direction of the research. RESULTS The International Aortic Arch Surgery Study Group has been formed by 41 arch surgeons from 10 countries to better evaluate patient outcomes after aortic arch surgery. Several projects, including the establishment of a multi-institutional retrospective database, randomized controlled trials and a prospectively collected database, are currently underway. CONCLUSIONS Such a collaborative effort will herald a turning point in the surgical management of aortic arch pathologies and will provide better-powered analyses to assess the impact of varying surgical techniques on mortality and morbidity, identify predictors of neurological and operative risk, formulate and validate risk prediction models, and review long-term survival outcomes and quality of life after arch surgery.
Abstract:
Electricity markets in the United States presently employ an auction mechanism to determine the dispatch of power generation units. In this market design, generators submit bid prices to a regulatory agency for review, and the regulator conducts an auction selection in a way that satisfies electricity demand. Most regulators currently use an auction selection method that minimizes total offer costs ["bid cost minimization" (BCM)] to determine electric dispatch. However, recent literature has shown that this method may not minimize consumer payments, and an alternative selection method that directly minimizes total consumer payments ["payment cost minimization" (PCM)] may benefit social welfare in the long term. The objective of this project is to further investigate the long-term benefit of PCM implementation and determine whether it can provide lower costs to consumers. The two auction selection methods are expressed as linear constraint programs and implemented in an optimization software package. A methodology for game-theoretic bidding simulation is developed using EMCAS, a real-time market simulator. Results of a 30-day simulation showed that PCM reduced energy costs for consumers by 12%. However, this result will be cross-checked in the future with two other methods of bid simulation proposed in this paper.
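The study formulates both auctions as linear constraint programs and simulates strategic bidding in EMCAS; the sketch below is only a brute-force toy on a handful of hypothetical bids, contrasting the BCM objective (total as-offered cost) with the PCM objective (uniform-price consumer payment plus start-up uplift). The bid data, start-up costs, and single-hour setting are assumptions chosen so the two objectives visibly diverge.

```python
from itertools import combinations

# Hypothetical single-hour bids: (name, capacity MW, energy price $/MWh, start-up $)
bids = [("G1", 100, 12.0, 0.0), ("G2", 50, 5.0, 200.0), ("G3", 50, 20.0, 0.0)]
demand = 100  # MW

def evaluate(subset):
    """Dispatch the committed units in energy-price merit order up to demand.
    Returns (offer_cost, consumer_payment) or None if demand cannot be met."""
    if sum(cap for _, cap, _, _ in subset) < demand:
        return None
    remaining, energy_cost, clearing = demand, 0.0, 0.0
    for _, cap, price, _ in sorted(subset, key=lambda b: b[2]):
        q = min(cap, remaining)
        if q > 0:
            energy_cost += q * price
            clearing = price          # marginal (highest dispatched) energy price
            remaining -= q
    startup = sum(s for _, _, _, s in subset)
    offer_cost = energy_cost + startup                  # BCM objective
    payment = clearing * demand + startup               # PCM objective
    return offer_cost, payment

best_bcm = best_pcm = None
for r in range(1, len(bids) + 1):
    for subset in combinations(bids, r):
        result = evaluate(subset)
        if result is None:
            continue
        cost, payment = result
        if best_bcm is None or cost < best_bcm[0]:
            best_bcm = (cost, payment, [b[0] for b in subset])
        if best_pcm is None or payment < best_pcm[1]:
            best_pcm = (cost, payment, [b[0] for b in subset])

print("BCM commits", best_bcm[2], "offer cost", best_bcm[0], "-> consumers pay", best_bcm[1])
print("PCM commits", best_pcm[2], "offer cost", best_pcm[0], "-> consumers pay", best_pcm[1])
```

In this toy case BCM commits the cheap-but-start-up-heavy unit and consumers pay more, while PCM commits a different set with a lower payment despite a higher as-offered cost, which is exactly the tension the paper investigates at scale.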
Abstract:
This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that does not necessarily prove consistent with the structure of the microeconomy. Thus, the social loss function cannot serve as a direct loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, this paper shows that control theory provides a benchmark for institution design in a game-theoretical framework.
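The paper's own model is not reproduced in the abstract; as a standard textbook illustration of how an inconsistent target in the social loss function drives the divergence, a Barro-Gordon-style sketch (with symbols pi, y, k, alpha, lambda assumed here rather than taken from the paper) runs as follows:

```latex
L^{S} = \tfrac{1}{2}\pi^{2} + \tfrac{\lambda}{2}\bigl(y - y^{n} - k\bigr)^{2},
\qquad y = y^{n} + \alpha\bigl(\pi - \pi^{e}\bigr), \qquad k > 0 .
```

Minimizing \(L^{S}\) while taking expectations \(\pi^{e}\) as given (the consistent, discretionary policy) yields

```latex
\pi = \frac{\alpha\lambda\,(k + \alpha\pi^{e})}{1 + \alpha^{2}\lambda},
\qquad \text{so with } \pi^{e} = \pi :\quad \pi^{d} = \alpha\lambda k > 0 ,
```

an inflation bias relative to the optimal commitment outcome \(\pi = 0\). Delegating to the central bank a loss function with the consistent target \(k = 0\) removes the bias, which is the spirit of the paper's argument that the central bank's loss function should differ from the social one.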
Abstract:
This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the social loss function cannot serve as a direct loss function for the central bank. Accordingly, we employ implementation theory to design a central bank loss function (mechanism design) with consistent targets, while the social loss function serves as a social welfare criterion. That is, with the correct mechanism design for the central bank loss function, optimal policy and consistent policy become identical. In other words, optimal policy proves implementable (consistent).
Abstract:
The use of intensity-modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verification, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. The precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse-plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam-modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
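The research introduces its own NAT metrics alongside established comparison tools; as a concrete illustration of the kind of 2D dose comparison involved, the sketch below computes a standard global gamma index (e.g., 2%/2 mm) between a calculated and a measured dose plane by brute force. It is not the authors' software, and the toy dose planes are made up.

```python
import numpy as np

def gamma_index(ref, evl, spacing_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """Brute-force global gamma between two 2D dose planes on the same grid.

    ref, evl    : 2D dose arrays (same shape); ref is searched for each evl point
    spacing_mm  : pixel spacing in mm
    dose_tol    : dose criterion as a fraction of the reference maximum (e.g. 0.02)
    dist_tol_mm : distance-to-agreement criterion in mm (e.g. 2.0)
    """
    dd = dose_tol * ref.max()
    ny, nx = ref.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    gamma = np.full(ref.shape, np.inf)
    search = int(np.ceil(3 * dist_tol_mm / spacing_mm))   # local search window
    for i in range(ny):
        for j in range(nx):
            i0, i1 = max(0, i - search), min(ny, i + search + 1)
            j0, j1 = max(0, j - search), min(nx, j + search + 1)
            dist2 = ((ys[i0:i1, j0:j1] - i) ** 2 + (xs[i0:i1, j0:j1] - j) ** 2) * spacing_mm ** 2
            dose2 = (ref[i0:i1, j0:j1] - evl[i, j]) ** 2
            gamma[i, j] = np.sqrt(np.min(dist2 / dist_tol_mm ** 2 + dose2 / dd ** 2))
    return gamma

# Toy example: a calculated plane and a slightly shifted, slightly scaled "measurement".
calc = np.exp(-((np.arange(64)[:, None] - 32) ** 2 + (np.arange(64)[None, :] - 32) ** 2) / 200.0)
meas = 1.01 * np.roll(calc, 1, axis=0)
g = gamma_index(calc, meas, spacing_mm=1.0)
print("pixels passing gamma <= 1:", round(100 * (g <= 1).mean(), 1), "%")
```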
Abstract:
Mixed longitudinal designs are important study designs for many areas of medical research. Mixed longitudinal studies have several advantages over cross-sectional or purely longitudinal studies, including shorter study completion time and the ability to separate time and age effects, and are thus an attractive choice. Statistical methodology for general longitudinal studies has been developing rapidly within the last few decades. A common approach for statistical modeling in studies with mixed longitudinal designs has been the linear mixed-effects model incorporating an age or time effect. The general linear mixed-effects model is considered an appropriate choice to analyze repeated measurements data in longitudinal studies. However, common applications of the linear mixed-effects model to mixed longitudinal studies often incorporate age as the only random effect and fail to take the cohort effect into consideration when conducting statistical inferences on age-related trajectories of outcome measurements. We believe special attention should be paid to cohort effects when analyzing data from mixed longitudinal designs with multiple overlapping cohorts; thus, this has become an important statistical issue to address. This research aims to address statistical issues related to mixed longitudinal studies. The proposed study examined existing statistical analysis methods for mixed longitudinal designs and developed an alternative analytic method to incorporate effects from multiple overlapping cohorts as well as from subjects of different ages. The proposed study used simulation to evaluate the performance of the proposed analytic method by comparing it with the commonly used model. Finally, the study applied the proposed analytic method to data collected by an existing study, Project HeartBeat!, which had been evaluated using traditional analytic techniques. Project HeartBeat! is a longitudinal study of cardiovascular disease (CVD) risk factors in childhood and adolescence using a mixed longitudinal design. The proposed model was used to evaluate four blood lipids, adjusting for age, gender, race/ethnicity, and endocrine hormones. The results of this dissertation suggest the proposed analytic model could be a more flexible and reliable choice than the traditional model in terms of fitting data to provide more accurate estimates in mixed longitudinal studies. Conceptually, the proposed model described in this study has useful features, including consideration of effects from multiple overlapping cohorts, and is an attractive approach for analyzing data in mixed longitudinal design studies.
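The dissertation's proposed model is not detailed in the abstract; the sketch below only simulates a mixed longitudinal structure with overlapping age cohorts and fits a conventional linear mixed-effects model that at least carries a cohort term, using statsmodels' MixedLM. All parameter values, cohort ages, and sample sizes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical mixed longitudinal design: three overlapping age cohorts,
# 30 subjects per cohort, each followed for 4 annual visits.
rows = []
for cohort, start_age in enumerate([8, 11, 14]):
    cohort_effect = rng.normal(0, 0.5)                 # shared cohort deviation
    for s in range(30):
        subj = f"c{cohort}s{s}"
        subj_effect = rng.normal(0, 1.0)               # subject random intercept
        for visit in range(4):
            age = start_age + visit
            y = 100 + 1.5 * age + cohort_effect + subj_effect + rng.normal(0, 1.0)
            rows.append({"subject": subj, "cohort": cohort, "age": age, "y": y})

df = pd.DataFrame(rows)

# Conventional analysis: random intercept per subject, cohort as an explicit term
# (the dissertation argues cohort variation deserves explicit attention).
result = smf.mixedlm("y ~ age + C(cohort)", df, groups="subject").fit()
print(result.summary())
```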
Abstract:
High-resolution, small-bore PET systems suffer from a tradeoff between system sensitivity and image quality degradation. In these systems, long crystals allow mispositioning of the line of response due to parallax error, and this mispositioning causes resolution blurring; however, long crystals are necessary for high system sensitivity. One means to allow long crystals without introducing parallax errors is to determine the depth of interaction (DOI) of the gamma-ray interaction within the detector module. While DOI has been investigated previously, newly available solid-state photomultipliers (SSPMs) are well suited to PET applications and allow new modules to be investigated. Depth of interaction in full modules is a relatively new field, so even where high-performance DOI-capable modules are available, appropriate means to characterize and calibrate them are not. This work presents an investigation of DOI-capable arrays and techniques for characterizing and calibrating those modules. The methods introduced here accurately and reliably characterize and calibrate energy, timing, and event interaction positioning. Additionally presented are a characterization of the spatial resolution of DOI-capable modules and a measurement of DOI effects for different angles between detector modules. These arrays have been built into a prototype PET system that delivers better than 2.0 mm resolution with a single-sided stopping power in excess of 95% for 511 keV gammas. The noise properties of SSPMs scale with the active area of the detector face, so the best signal-to-noise ratio is possible with parallel readout of each SSPM photodetector pixel rather than multiplexing signals together. This work additionally investigates several algorithms for improving timing performance using timing information from multiple SSPM pixels when light is distributed among several photodetectors.
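The specific multi-pixel timing algorithms evaluated in this work are not described in the abstract, so the sketch below shows only one generic possibility: an amplitude-weighted average of per-pixel timestamps, so that noisy low-light pixels count less. The event data, jitter model, and weighting are assumptions for illustration, not the thesis method.

```python
import numpy as np

rng = np.random.default_rng(7)

def combined_timestamp(times_ns, amplitudes):
    """Fuse per-pixel timing by weighting each SSPM pixel's timestamp
    by its signal amplitude (illustrative only)."""
    w = np.asarray(amplitudes, dtype=float)
    return float(np.sum(w * np.asarray(times_ns)) / np.sum(w))

# Simulate one 511 keV event: true arrival at t = 0, light shared over 9 pixels,
# with per-pixel jitter that is worse for pixels collecting less light.
true_t = 0.0
amplitudes = rng.uniform(5, 100, size=9)              # arbitrary light shares
jitter_ns = 0.5 / np.sqrt(amplitudes / amplitudes.max())
times = true_t + rng.normal(0, jitter_ns)

single_best = times[np.argmax(amplitudes)]            # use only the brightest pixel
fused = combined_timestamp(times, amplitudes)
print(f"brightest-pixel estimate: {single_best:+.3f} ns, weighted estimate: {fused:+.3f} ns")
```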