932 results for computer system emulation, multiprocessors, educational computer systems
Abstract:
BACKGROUND AND OBJECTIVES: Experimental assessment of photodynamic therapy (PDT) for malignant pleural mesothelioma using a polyethylene glycol conjugate of meta-tetrahydroxyphenylchlorin (PEG-mTHPC). STUDY DESIGN/MATERIALS AND METHODS: (a) PDT was tested on H-meso-1 xenografts (652 nm laser light; fluence 10 J/cm²; 0.93, 9.3, or 27.8 mg/kg of PEG-mTHPC; drug-light intervals 3-8 days). (b) Intraoperative PDT with similar treatment conditions was performed in the chest cavity of minipigs (n = 18) following extrapleural pneumonectomy (EPP), using an optical integrating balloon device combined with in situ light dosimetry. RESULTS: (a) PDT using PEG-mTHPC resulted in a larger extent of tumor necrosis than in untreated tumors (P ≤ 0.01) without causing damage to normal tissue. (b) Intraoperative PDT following EPP was well tolerated in 17 of 18 animals. Mean fluences and fluence rates measured at four sites of the chest cavity ranged from 10.2 ± 0.2 to 13.2 ± 2.3 J/cm² and from 5.5 ± 1.2 to 7.9 ± 1.7 mW/cm² (mean ± SD). Histology 3 months after light delivery revealed no PDT-related tissue injury in all but one animal. CONCLUSIONS: PEG-mTHPC-mediated PDT showed selective destruction of mesothelioma xenografts without causing damage to intrathoracic organs in pigs under similar treatment conditions. The light delivery system afforded regular light distribution to different parts of the chest cavity.
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC started to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC processors, RISC processor architecture is a separate industry from the RISC chip manufacturing industry. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated designs consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
This report describes results from a study evaluating stringless paving using a combination of global positioning and laser technologies. CMI and Geologic Computer Systems developed this technology and successfully implemented it on construction earthmoving and grading projects. Concrete paving is a new area for this technology. Fred Carlson Co. agreed to test stringless paving on two challenging concrete paving projects located in Washington County, Iowa; the evaluation was conducted on these two projects during the summer of 2003. The research team from Iowa State University monitored guidance and elevation conformance to the original design. They employed a combination of physical depth checks, surface location and elevation surveys, concrete yield checks, and a physical survey of the control stakes and string line elevations. A final check on the profile of the pavement surface was accomplished using the Iowa Department of Transportation Light Weight Surface Analyzer (LISA). Due to the speed of paving and the rapid changes in terrain, the laser technology was abandoned for this project. Total control of guidance and elevation on the slip-form paver was moved from the string line to global positioning systems (GPS). The evaluation was a success, and the results indicate that GPS control is feasible and approaches the desired goals of guidance and profile control with the use of three-dimensional design models. Further enhancements are needed in the physical features of the slip-form paver's oil system controls and in the computer program for controlling elevation.
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows hypercube generation to be done easily within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework with different data storage model configurations (i.e. row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we discuss the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular available solutions (which are not always the best fit for such ad hoc applications), and describe some user experiences with the framework, which was employed in a number of dedicated workshops on astronomical data analysis techniques.
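To make the idea concrete, the following is a minimal sketch of how a multidimensional histogram (hypercube) can be expressed as a map step plus an associative reduce step; the bin widths, field names and record format are illustrative assumptions and do not reproduce the framework's actual interface or its Hadoop bindings.

```python
# Minimal sketch of hypercube (multidimensional histogram) generation in
# MapReduce style. The bin edges, field names and record format are
# illustrative assumptions, not the Gaia framework's actual interface.
from collections import Counter
from functools import reduce

BIN_WIDTHS = {"parallax": 0.5, "g_mag": 1.0}  # hypothetical axes

def map_record(record):
    """Map step: emit ((bin, bin, ...), 1) for one source record."""
    key = tuple(int(record[axis] // width) for axis, width in BIN_WIDTHS.items())
    return (key, 1)

def reduce_counts(partial_a, partial_b):
    """Reduce step: merge two partial hypercubes (bin -> count)."""
    merged = Counter(partial_a)
    merged.update(partial_b)
    return merged

def hypercube(records):
    """Run the map and reduce phases over an iterable of records."""
    partials = (Counter({k: v}) for k, v in map(map_record, records))
    return reduce(reduce_counts, partials, Counter())

if __name__ == "__main__":
    sample = [{"parallax": 1.2, "g_mag": 14.3}, {"parallax": 1.4, "g_mag": 14.9}]
    print(hypercube(sample))  # Counter({(2, 14): 2})
```

Because the reduce step is a plain merge of partial counters, it is associative and commutative, which is what lets the computation be distributed freely across mappers and reducers.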
Abstract:
The purpose of this study is to propose a new quantitative approach to assessing the quality of open access university institutional repositories. The results of this new approach are tested on the Spanish university repositories. The assessment method is based on a binary codification of a proposed set of features that objectively describe the repositories. The aims of this method are to assess quality and to provide an almost automatic system for updating the data on these characteristics. First, a database was created with the 38 Spanish institutional repositories. The variables of analysis are presented and explained, whether they are drawn from the literature or are newly proposed. Among the characteristics analyzed are the features of the software, the services of the repository, the features of the information system, Internet visibility and the licenses of use. Results from the Spanish universities are provided as a practical example of the assessment and to give a picture of the state of development of the open access movement in Spain.
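As an illustration of the binary codification idea, the sketch below encodes a repository as a 0/1 vector over a feature checklist and derives a simple score; the feature names, the equal weighting and the example repository are assumptions made for the example and are not the indicator set defined in the study.

```python
# Minimal sketch of a binary codification of repository features and a simple
# quality score. Feature names and the example repository are illustrative
# assumptions; they are not the indicators defined in the study.
FEATURES = [
    "oai_pmh_interface",        # exposes metadata via OAI-PMH
    "persistent_identifiers",   # handles/DOIs for deposited items
    "usage_statistics",         # public download/visit statistics
    "creative_commons_licenses",
    "search_engine_indexing",   # visible in general search engines
]

def codify(repository_features):
    """Return the binary vector (1 = feature present, 0 = absent)."""
    return [1 if f in repository_features else 0 for f in FEATURES]

def quality_score(binary_vector):
    """Score = fraction of features present, expressed as a percentage."""
    return 100.0 * sum(binary_vector) / len(binary_vector)

if __name__ == "__main__":
    example = {"oai_pmh_interface", "usage_statistics", "creative_commons_licenses"}
    vector = codify(example)
    print(vector, quality_score(vector))  # [1, 0, 1, 1, 0] 60.0
```

Keeping the codification binary is what makes the updating of the data nearly automatic: each indicator can be re-checked programmatically and the score recomputed without manual judgement.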
Abstract:
User authentication in information systems has been one of the cornerstones of information security for decades. The idea of a username and password is the most cost-effective and most widely used way of maintaining trust between an information system and its users. In the early days of information system adoption, when companies had only a few systems used by a small group of users, this model proved workable. Over the years the number of systems grew, and with it the number and diversity of passwords. No one could predict how many password-related problems users would encounter, how much these would congest corporate help desks, or what kind of security risks passwords would pose in large enterprises. In this Master's thesis we examine the problems caused by passwords in a large, global company. The problems are considered from four perspectives: people, technology, information security and business. They are demonstrated by presenting the results of a survey of the company's employees, carried out as part of this thesis. A solution to these problems is presented in the form of a centralized password management system. Various features of the system are evaluated, and a pilot-type implementation is built to demonstrate the functionality of such a system.
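As a rough illustration of the concept (not the pilot implementation built in the thesis), the sketch below keeps credentials for many target systems in a single vault that is unlocked by one master secret; the use of the Python cryptography library and all names are assumptions made for the example.

```python
# Minimal sketch of a centralized password vault: one master secret unlocks
# credentials for many target systems. The library choice (cryptography) and
# all names are illustrative assumptions, not the pilot built in the thesis.
import base64, json, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(master_password: str, salt: bytes) -> bytes:
    """Derive a Fernet key from the single master password."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(master_password.encode()))

class PasswordVault:
    def __init__(self, master_password: str):
        self.salt = os.urandom(16)
        self._fernet = Fernet(derive_key(master_password, self.salt))
        self._store = {}  # system name -> encrypted credential blob

    def add(self, system: str, username: str, password: str) -> None:
        blob = json.dumps({"user": username, "pass": password}).encode()
        self._store[system] = self._fernet.encrypt(blob)

    def get(self, system: str) -> dict:
        return json.loads(self._fernet.decrypt(self._store[system]))

if __name__ == "__main__":
    vault = PasswordVault("correct horse battery staple")
    vault.add("erp", "jdoe", "s3cret")
    print(vault.get("erp"))  # {'user': 'jdoe', 'pass': 's3cret'}
```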
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
Abstract:
A virtual community can be defined as a whole formed by people, relationships of varying intensity, a shared purpose and information technology systems. The system solutions form the playing field of the community's activity, the virtual arena. The virtual arena is thus an inseparable part of the virtual community. The purpose of this thesis is to study the development process and life cycle of a virtual community. In addition, the thesis describes methods that can be used to map the needs of community members at different stages of activity. These methods can be applied in the development of different types of virtual environments. Practical development work is examined through the Vaikuttamo case. The aim was to find out how the virtual arena serving Vaikuttamo was designed and which factors have accelerated Vaikuttamo's development into a functioning community. The research method was the thematic interview, complemented by data from a web survey conducted in May 2003. Based on the results, it was observed that the development process described in the literature would not have suited Vaikuttamo's needs, because it was an entirely new type of community. The key factors in Vaikuttamo's success can be considered to be its local character, guidance from outside the actual community, and strong ties to the schools and teachers in the Hämeenlinna region.
Abstract:
The advent of the Internet had a great impact on distance education, and e-learning has rapidly become a killer application. Education institutions worldwide are taking advantage of the available technology in order to deliver education to a growing audience. Every day, more and more people use e-learning systems, environments and contents for both training and learning. E-learning promotes education among people who, for different reasons, could not otherwise have access to education: people who could not travel, people with very little free time, people with disabilities, etc. As e-learning systems grow and more people access them, it is necessary to consider, when designing virtual environments, the diverse needs and characteristics of different users. This allows building systems that people can use easily, efficiently and effectively, where the learning process leads to a good user experience and becomes a good learning experience.
Abstract:
Network virtualisation is considerably gaining attention as a solution to ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time, while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
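As a rough sketch of the kind of evaluative-feedback learner each substrate node could run, the example below uses one-step Q-learning over a discretised utilisation/reservation state; the discretisation, reward shape and epsilon-greedy policy are assumptions made for illustration and are not the coordinated multiagent algorithm specified in the paper.

```python
# Minimal sketch of a per-substrate-node learning agent that adjusts how much
# of its capacity is reserved for hosted virtual nodes, using evaluative
# feedback. The state discretisation, reward and epsilon-greedy policy are
# illustrative assumptions, not the algorithm specified in the paper.
import random
from collections import defaultdict

ACTIONS = [-0.1, 0.0, +0.1]  # decrease, keep, or increase the reserved fraction

class SubstrateNodeAgent:
    def __init__(self, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)  # (state, action) -> estimated value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    @staticmethod
    def state(utilisation, reserved):
        """Discretise the observed utilisation and current reservation."""
        return (round(utilisation, 1), round(reserved, 1))

    def choose(self, s):
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)          # explore
        return max(ACTIONS, key=lambda a: self.q[(s, a)])  # exploit

    def learn(self, s, a, reward, s_next):
        """Standard one-step Q-learning update."""
        best_next = max(self.q[(s_next, b)] for b in ACTIONS)
        self.q[(s, a)] += self.alpha * (reward + self.gamma * best_next - self.q[(s, a)])

def reward(utilisation, reserved):
    """Penalise both over-reservation (idle capacity) and shortage."""
    return -abs(reserved - utilisation)

if __name__ == "__main__":
    agent, reserved = SubstrateNodeAgent(), 0.5
    for _ in range(1000):
        utilisation = random.uniform(0.2, 0.8)     # stand-in for virtual node demand
        s = agent.state(utilisation, reserved)
        a = agent.choose(s)
        reserved = min(1.0, max(0.0, reserved + a))
        agent.learn(s, a, reward(utilisation, reserved), agent.state(utilisation, reserved))
```

The reward used here captures the trade-off stated in the abstract: the agent is punished both for reserving more than the virtual networks actually use and for reserving less than they need.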
Abstract:
In today's society, companies depend to a large extent on their IT resources. Their ability to survive and innovate in the current market, where competition grows stronger every day, relies on an IT infrastructure that allows them not only to deploy and roll out computers and servers quickly and efficiently, but also protects them against IT system downtime, server problems, crashes and physical hardware disasters. To avoid these IT problems, which are capable of halting a company's operations, work began in the field of virtualization with the goal of finding solutions to these problems while also making better and more efficient use of the existing hardware resources, thereby also reducing the cost of the IT infrastructure. The main objective of this work is to follow at first hand the conversion of a real company from an infrastructure of the type one physical server - one role to a virtual infrastructure of the type one physical server - several virtual servers - several roles. We analyze the current state of the company, its servers and their roles, acquire the necessary hardware and convert all of its servers to a new virtual infrastructure. Special attention is paid to explaining why one option is used rather than another, and several options are offered wherever possible. Complementary observations worth keeping in mind are highlighted in green boxes, and issues requiring special attention at the moment they are carried out are highlighted in red boxes. Finally, once the conversion is complete, we review the many advantages this technology has brought in terms of reliability, stability, fault tolerance, rapid deployment of new machines, system recovery capability and better use of physical resources.
Abstract:
In the field of molecular biology, scientists adopted for decades a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. However, integrative thinking had also been applied on a smaller scale in molecular biology for at least half a century to understand the underlying processes of cellular behaviour. It was not until the genomic revolution at the end of the previous century that model building was required to account for systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability of predicting cellular behaviour in a way that reflects system dynamics and system structures. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology to computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems, such as a rigorous characterization of the system structure, simulation techniques and perturbation analysis. Computational biomodels have grown considerably in size in the past years, with major contributions being made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires, in fact, the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative: reaction-network models, rule-based models and Petri net models, as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
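As a toy illustration of fit-preserving quantitative model refinement (not one of the models developed in the thesis), the sketch below splits one species of an invented reaction network into two subtypes and chooses the refined kinetics so that their sum reproduces the abstract model's trajectory.

```python
# Minimal sketch of quantitative model refinement on an invented reaction
# network: the abstract model A -> (degradation, rate k) is refined by
# splitting A into subtypes A1 and A2 with rates chosen so that A1 + A2
# reproduces the original A. This toy example is not a model from the thesis.

def simulate_basic(a0=1.0, k=0.5, dt=0.01, steps=1000):
    """Euler integration of dA/dt = -k*A for the abstract model."""
    a, trace = a0, []
    for _ in range(steps):
        a += dt * (-k * a)
        trace.append(a)
    return trace

def simulate_refined(a0=1.0, split=0.4, k=0.5, dt=0.01, steps=1000):
    """Refined model: A split into A1 and A2, both degraded with the same
    rate constant k, so that A1(t) + A2(t) matches A(t) of the basic model."""
    a1, a2, trace = split * a0, (1 - split) * a0, []
    for _ in range(steps):
        a1 += dt * (-k * a1)
        a2 += dt * (-k * a2)
        trace.append(a1 + a2)
    return trace

if __name__ == "__main__":
    basic, refined = simulate_basic(), simulate_refined()
    # The refinement is fit-preserving: the summed refined trajectory
    # coincides with the abstract one up to numerical error.
    print(max(abs(b - r) for b, r in zip(basic, refined)))  # ~0.0
```

The point of such a stepwise refinement is that the already validated fit of the abstract model is carried over to the more detailed one, so the added species or reactions do not force a complete re-parameterisation.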