80 results for Computer game -- Programming
Abstract:
There is a lack of dedicated tools for business model design at a strategic level. However, in today's economic world the ability to quickly reinvent a company's business model is essential to stay competitive. This research focused on identifying the functionalities that are necessary in a computer-aided design (CAD) tool for the design of business models in a strategic context. Using the design science research methodology, a series of techniques and prototypes was designed and evaluated to offer solutions to the problem. The work is a collection of articles which can be grouped into three parts. The first establishes the context of how the Business Model Canvas (BMC) is used to design business models and explores the way in which CAD can contribute to the design activity. The second part builds on this by proposing new techniques and tools which support the elicitation, evaluation (assessment), and evolution of business model designs with CAD. These include features such as multi-color tagging to easily connect elements, rules to validate the coherence of business models, and features that adapt to the business model proficiency level of their users. A new way to describe and visualize multiple versions of a business model, and thereby help address the business model as a dynamic object, was also researched. The third part explores extensions to the Business Model Canvas, such as an intermediary model which supports IT alignment by connecting the business model and the enterprise architecture, and a business model pattern for privacy in a mobile environment that uses privacy as a key value proposition. The prototyped techniques and the propositions for using CAD tools in business model design will allow commercial CAD developers to create tools that are better suited to the needs of practitioners.
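For illustration only, the following is a minimal sketch of one kind of coherence rule a BMC design tool could enforce. The specific rule shown (every customer segment must be served by a value proposition and reached by a channel), and all names in the code, are assumptions for the example, not rules taken from the thesis.

    # Hypothetical sketch of a Business Model Canvas coherence check.
    from dataclasses import dataclass, field

    @dataclass
    class Canvas:
        segments: set = field(default_factory=set)        # customer segments
        propositions: set = field(default_factory=set)    # value propositions
        vp_links: set = field(default_factory=set)        # (value proposition, segment) links
        channel_links: set = field(default_factory=set)   # (channel, segment) links

    def check_coherence(canvas):
        """Return human-readable warnings for customer segments left unconnected."""
        warnings = []
        for seg in canvas.segments:
            if not any(s == seg for _, s in canvas.vp_links):
                warnings.append(f"Segment '{seg}' has no value proposition serving it.")
            if not any(s == seg for _, s in canvas.channel_links):
                warnings.append(f"Segment '{seg}' is not reached by any channel.")
        return warnings

    c = Canvas(segments={"SMEs"}, propositions={"self-service design tool"},
               vp_links={("self-service design tool", "SMEs")}, channel_links=set())
    print(check_coherence(c))   # -> ["Segment 'SMEs' is not reached by any channel."]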
Abstract:
BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. The current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classifying major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for one year's data (2004) from 25 registries. The cases classified as potential multiple congenital anomalies were manually reviewed by three geneticists to reach a final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching 35% among cases of bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve the classification of congenital anomalies for surveillance and research.
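As a rough illustration of what such an ICD-10-based triage step can look like, here is a simplified sketch. The code prefixes, the minor-anomaly exclusion list, and the grouping of anomalies by three-character code stem are hypothetical placeholders, not the published EUROCAT definitions.

    # Hypothetical sketch of an ICD-10 triage step in the spirit of the algorithm above.
    def classify_case(icd10_codes):
        """Assign one case (a list of ICD-10 codes) to a surveillance class."""
        CHROMOSOMAL = ("Q90", "Q91", "Q92", "Q93", "Q95", "Q96", "Q97", "Q98", "Q99")
        MONOGENIC   = ("Q87",)            # placeholder for recognised monogenic syndromes
        MINOR       = {"Q38.1", "Q65.5"}  # placeholder list of minor anomalies to ignore

        codes = [c for c in icd10_codes if c not in MINOR]

        if any(c.startswith(CHROMOSOMAL) for c in codes):
            return "chromosomal"
        if any(c.startswith(MONOGENIC) for c in codes):
            return "monogenic syndrome"

        # Count distinct major anomalies, crudely, by three-character ICD-10 stem.
        major_stems = {c[:3] for c in codes if c.startswith("Q")}
        if len(major_stems) >= 2:
            return "potentially multiple"   # flagged for manual review by geneticists
        return "isolated"

    print(classify_case(["Q21.0", "Q79.2"]))   # -> "potentially multiple"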
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because their cost was significantly lower than that of their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC started to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base owing to strong technology-enabled customer lock-in and the customers' high risk exposure, since their production depends on the fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix and Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets and will eventually also affect industrial automation through game-changing commoditization and related control-point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
The Learning Affect Monitor (LAM) is a new computer-based assessment system that integrates basic dimensional evaluation and discrete description of affective states in daily life, based on an autonomously adapting system. Subjects evaluate their affective state within a three-dimensional space (a valence and activation circumplex plus global intensity) and then qualify it using up to 30 adjective descriptors chosen from a list. The system gradually adapts to the user, enabling the affect descriptors it presents to become increasingly relevant. An initial study with 51 subjects, using one week of time-sampling with 8 to 10 randomized signals per day, produced n = 2,813 records with good reliability measures (e.g., a response rate of 88.8% and a mean split-half reliability of .86), user acceptance, and usability. Multilevel analyses show circadian and weekly patterns, and significant individual and situational variance components of the basic dimension evaluations. Validity analyses indicate sound placement of the qualitative affect descriptors in the two-dimensional semantic space according to the circumplex model of basic affect dimensions. The LAM assessment module can be implemented on different platforms (palmtop, desktop, mobile phone) and provides very rapid and meaningful data collection, preserving complex and interindividually comparable information in the domain of emotion and well-being.
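One simple way an assessment tool could adapt the order of presented descriptors to a user, as the LAM is described as doing, is sketched below. The ranking rule (selection-frequency ordering) and all descriptor names are assumptions for illustration, not the LAM's actual adaptation algorithm.

    # Hypothetical sketch of user-adaptive descriptor presentation.
    from collections import defaultdict

    class DescriptorAdapter:
        def __init__(self, descriptors):
            self.descriptors = list(descriptors)   # full pool of adjective descriptors
            self.counts = defaultdict(int)         # how often this user selected each one

        def record_selection(self, chosen):
            for d in chosen:
                self.counts[d] += 1

        def present(self, n=30):
            # Show the n descriptors this user selects most often, most frequent first.
            ranked = sorted(self.descriptors, key=lambda d: -self.counts[d])
            return ranked[:n]

    adapter = DescriptorAdapter(["calm", "tense", "cheerful", "tired", "irritated"])
    adapter.record_selection(["calm", "cheerful"])
    print(adapter.present(n=3))   # -> ['calm', 'cheerful', 'tense'] (ties keep pool order)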
Abstract:
Positron emission tomography (PET) is a functional, noninvasive method for imaging regional metabolic processes that is nowadays most often combined with morphological imaging by computed tomography (CT). Its use is based on the well-founded assumption that metabolic changes occur earlier in tumors than morphologic changes, adding another dimension to imaging. This article reviews the established and investigational indications and radiopharmaceuticals for PET/CT imaging of prostate cancer, bladder cancer and testicular cancer, before presenting upcoming applications in radiation therapy.
Abstract:
High altitude constitutes an exciting natural laboratory for medical research. While the aim of high-altitude research was initially to understand the adaptation of the organism to hypoxia and to find treatments for altitude-related diseases, over the past decade or so the scope of this research has broadened considerably. Two important observations laid the foundation for this broadening of scope. First, high-altitude pulmonary edema (HAPE) represents a unique model for studying fundamental mechanisms of pulmonary hypertension and lung edema in humans. Second, the ambient hypoxia associated with high-altitude exposure facilitates the detection of pulmonary and systemic vascular dysfunction at an early stage. Here, we review studies that, by capitalizing on these observations, have led to the description of novel mechanisms underpinning lung edema and pulmonary hypertension and to the first direct demonstration of fetal programming of vascular dysfunction in humans.
Abstract:
Computed tomography angiography (CTA) images are the standard for assessing peripheral artery disease (PAD). This paper presents a computer-aided detection (CAD) and computer-aided measurement (CAM) system for PAD. The CAD stage detects the arterial network using a 3D region-growing method and a fast 3D morphology operation. The CAM stage aims to accurately measure artery diameters from the detected vessel centerline, compensating for the partial volume effect using expectation maximization (EM) and a Markov random field (MRF). The system has been evaluated on phantom data and applied to fifteen (15) CTA datasets, where the stenosis detection accuracy was 88% and the measurement error was within 8%.
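To make the detection stage concrete, here is a minimal sketch of plain 3D region growing of the kind the CAD stage relies on. The 6-connectivity, the fixed intensity window, and the toy volume are simplifying assumptions; the published system additionally applies fast 3D morphology and is not reproduced here.

    # Hypothetical sketch of 3D region growing from a seed voxel.
    import numpy as np
    from collections import deque

    def region_grow_3d(volume, seed, lo, hi):
        """Return a boolean mask of voxels reachable from `seed` with lo <= value <= hi."""
        mask = np.zeros(volume.shape, dtype=bool)
        queue = deque([seed])
        offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        while queue:
            z, y, x = queue.popleft()
            if mask[z, y, x] or not (lo <= volume[z, y, x] <= hi):
                continue
            mask[z, y, x] = True
            for dz, dy, dx in offsets:
                nz, ny, nx = z + dz, y + dy, x + dx
                if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                        and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]):
                    queue.append((nz, ny, nx))
        return mask

    # Toy example: a bright "vessel" running along the z axis in a dark background.
    vol = np.zeros((20, 20, 20))
    vol[:, 10, 10] = 300.0
    print(region_grow_3d(vol, seed=(0, 10, 10), lo=200, hi=400).sum())   # -> 20 voxels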
A biophysical model of atrial fibrillation ablation: what can a surgeon learn from a computer model?
Abstract:
AIMS: Surgical ablation procedures for treating atrial fibrillation have been shown to be highly successful. However, the ideal ablation pattern still remains to be determined. This article reports a systematic study of the effectiveness of different ablation line patterns. METHODS AND RESULTS: This study of ablation line patterns was performed in a biophysical model of the human atria by combining basic lines: (i) in the right atrium, an isthmus line, a line between the venae cavae, and an appendage line; and (ii) in the left atrium, several versions of pulmonary vein isolation, connection of the pulmonary veins, an isthmus line, and an appendage line. Success rates and the presence of residual atrial flutter were documented. Basic patterns yielded conversion rates of only 10-25% and 10-55% in the right and the left atrium, respectively. The best result for pulmonary vein isolation was obtained when a single closed line encompassed all veins (55%). Combinations of lines in the right or the left atrium alone led to success rates of 65% and 80%, respectively. Higher rates, up to 90-100%, could be obtained if right and left lines were combined. The inclusion of a left isthmus line was found to be essential for avoiding uncommon left atrial flutter. CONCLUSION: Some of the patterns studied achieved a high conversion rate, although using fewer lines than the Maze III procedure. The biophysical atrial model is shown to be effective in the search for promising alternative ablation strategies.
Abstract:
In order to understand the development of non-genetically encoded actions during an animal's lifespan, it is necessary to analyze the dynamics and evolution of learning rules producing behavior. Owing to the intrinsic stochastic and frequency-dependent nature of learning dynamics, these rules are often studied in evolutionary biology via agent-based computer simulations. In this paper, we show that stochastic approximation theory can help to qualitatively understand learning dynamics and formulate analytical models for the evolution of learning rules. We consider a population of individuals repeatedly interacting during their lifespan, and where the stage game faced by the individuals fluctuates according to an environmental stochastic process. Individuals adjust their behavioral actions according to learning rules belonging to the class of experience-weighted attraction learning mechanisms, which includes standard reinforcement and Bayesian learning as special cases. We use stochastic approximation theory in order to derive differential equations governing action play probabilities, which turn out to have qualitative features of mutator-selection equations. We then perform agent-based simulations to find the conditions where the deterministic approximation is closest to the original stochastic learning process for standard 2-action 2-player fluctuating games, where interaction between learning rules and preference reversal may occur. Finally, we analyze a simplified model for the evolution of learning in a producer-scrounger game, which shows that the exploration rate can interact in a non-intuitive way with other features of co-evolving learning rules. Overall, our analyses illustrate the usefulness of applying stochastic approximation theory in the study of animal learning.
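For readers unfamiliar with the learning model named above, the following sketch shows the standard experience-weighted attraction (EWA) update with a logit (softmax) choice rule. The parameter values and the toy 2-action payoffs are arbitrary placeholders, and this is the generic textbook form of EWA, not the specific co-evolutionary model analyzed in the paper; setting delta = 0 recovers standard reinforcement learning and delta = 1 belief-based learning.

    # Illustrative sketch of the experience-weighted attraction (EWA) learning update.
    import numpy as np

    def ewa_step(A, N, payoffs, chosen, phi=0.9, delta=0.5, rho=0.9):
        """One EWA update. A: attractions per action, N: experience weight,
        payoffs[j]: payoff action j would have earned this round, chosen: index played."""
        N_new = rho * N + 1.0
        # Chosen action is reinforced with weight 1, foregone actions with weight delta.
        reinforcement = np.where(np.arange(len(A)) == chosen, 1.0, delta) * payoffs
        A_new = (phi * N * A + reinforcement) / N_new
        return A_new, N_new

    def choice_probs(A, lam=2.0):
        """Logit (softmax) choice rule over attractions."""
        w = np.exp(lam * (A - A.max()))
        return w / w.sum()

    # Toy 2-action round: the foregone action 1 would have paid more than the action played.
    A, N = np.zeros(2), 1.0
    A, N = ewa_step(A, N, payoffs=np.array([0.2, 1.0]), chosen=0)
    print(choice_probs(A))   # probability mass shifts toward action 1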
Abstract:
The purpose of this study was to clinically validate new two-dimensional preoperative planning software for cementless total hip arthroplasty (THA). Manual and two-dimensional computer-assisted planning were compared by an independent observer for each of 30 patients with osteoarthritis who underwent THA. The study showed no statistically significant differences between the two preoperative plans in terms of stem size and neck length (<1 size) or hip rotation center position (<5 mm). Two-dimensional computer-assisted preoperative planning provided results comparable to those of the manual procedure, while allowing the surgeon to simulate various stem designs easily.
Abstract:
PURPOSE: To prospectively evaluate the accuracy and reliability of "freehand" posttraumatic orbital wall reconstruction with AO (Arbeitsgemeinschaft Osteosynthese) titanium mesh plates, using computer-aided volumetric measurement of the bony orbits. METHODS: Bony orbital volume was measured in 12 patients from coronal CT slices using the OsiriX medical imaging software. After defining the volumetric limits of the orbit, the bony orbital region of interest was segmented on each slice. At the end of the segmentation process, all regions of interest were grouped and the volume was computed. The same procedure was performed on both orbits, and the volume of the contralateral uninjured orbit was then used as a control for comparison. RESULTS: In all patients, the volume of the reconstructed orbit matched that of the contralateral uninjured orbit to within 1.85 cm3 (7%). CONCLUSIONS: This preliminary study demonstrates that posttraumatic orbital wall reconstruction using "freehand" bending and placement of AO titanium mesh plates achieves a high success rate in re-establishing the preoperative bony volume, which closely approximates that of the contralateral uninjured orbit.
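As a rough illustration of the volumetric step, once the region of interest has been outlined on each coronal slice the volume can be obtained by summing slice areas times slice spacing, as sketched below. The pixel spacing, slice thickness, and toy circular ROI are placeholder assumptions, not values from the study or the OsiriX workflow.

    # Hypothetical sketch of computing an orbital volume from per-slice ROI masks.
    import numpy as np

    def orbital_volume_cm3(roi_masks, pixel_spacing_mm=(0.4, 0.4), slice_thickness_mm=1.5):
        """roi_masks: list of 2D boolean arrays, one segmented orbit ROI per CT slice."""
        pixel_area_mm2 = pixel_spacing_mm[0] * pixel_spacing_mm[1]
        volume_mm3 = sum(mask.sum() * pixel_area_mm2 * slice_thickness_mm for mask in roi_masks)
        return volume_mm3 / 1000.0   # mm^3 -> cm^3

    # Toy example: 20 identical circular ROIs of radius 25 pixels.
    yy, xx = np.ogrid[:128, :128]
    disc = (yy - 64) ** 2 + (xx - 64) ** 2 <= 25 ** 2
    print(round(orbital_volume_cm3([disc] * 20), 2))   # roughly 9.4 cm^3 for this toy geometry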