857 results for Laptop computers
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. Today this industry employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike the CISC world, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones has created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-driven disruptions in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base, due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will, in due course, face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix and Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
For a decade now there has been talk of digital convergence, which has brought about the conjunction of computing with the media and networked interconnection. Today, old and new media, ever more flexible, circulate with ease. The consequence for users is that they now have a wide variety of content available, permanently connected anywhere and at any time, across multiple platforms and in a rich and complex coexistence. In this context, this article presents the conclusions of a field study on the use, consumption and preferences regarding digital communication media and content among groups of children, young people, adults and older people in Catalonia.
Abstract:
Schizophrenia is often considered a dysconnection syndrome in which abnormal interactions between large-scale functional brain networks result in cognitive and perceptual deficits. In this article we apply graph-theoretic measures to brain functional networks based on the resting EEGs of fourteen schizophrenic patients in comparison with those of fourteen matched control subjects. The networks were extracted from common-average-referenced EEG time series through partial and unpartial cross-correlation methods. Unpartial correlation detects functional connectivity based on direct and/or indirect links, while partial correlation allows one to ignore indirect links. We quantified the network properties with graph metrics including small-worldness, vulnerability, modularity, assortativity, and synchronizability. The schizophrenic patients showed method-specific and frequency-specific changes, especially pronounced for the modularity, assortativity, and synchronizability measures. However, the differences between schizophrenia patients and normal controls in terms of graph-theory metrics were stronger for the unpartial correlation method.
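As a rough illustration of the kind of pipeline this abstract describes, the sketch below builds unpartial (Pearson) and partial correlation networks from multichannel EEG arrays and computes a few of the listed graph metrics with networkx; the channel count, correlation threshold and synthetic signals are illustrative assumptions, not the study's data or exact method.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

def correlation_adjacency(eeg, partial=False, threshold=0.3):
    """Build a binary adjacency matrix from EEG time series (channels x samples).

    Unpartial: pairwise Pearson correlation (captures direct and indirect links).
    Partial:   derived from the precision (inverse covariance) matrix,
               which suppresses indirect links.
    """
    if partial:
        prec = np.linalg.pinv(np.cov(eeg))
        d = np.sqrt(np.diag(prec))
        corr = -prec / np.outer(d, d)        # standard partial-correlation formula
        np.fill_diagonal(corr, 1.0)
    else:
        corr = np.corrcoef(eeg)
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

def graph_metrics(adj):
    """Compute a few of the graph metrics mentioned in the abstract."""
    g = nx.from_numpy_array(adj)
    comms = greedy_modularity_communities(g)
    return {
        "assortativity": nx.degree_assortativity_coefficient(g),
        "modularity": modularity(g, comms),
        "global_efficiency": nx.global_efficiency(g),  # proxy for integration
    }

# Illustrative use: synthetic signals standing in for common-average-referenced EEG
rng = np.random.default_rng(0)
eeg = rng.standard_normal((14, 2048))        # 14 channels, 2048 samples
eeg[:9] += rng.standard_normal(2048)         # shared source across channels 0-8
eeg[5:] += rng.standard_normal(2048)         # shared source across channels 5-13
print(graph_metrics(correlation_adjacency(eeg)))   # unpartial network metrics
```

The same metrics would be computed on the partial-correlation network (`partial=True`) and compared between groups per frequency band, which is where the method-specific and frequency-specific differences reported above would appear.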
Abstract:
The Millennial generation is changing the way of learning, prompting educational institutions to attempt to better adapt to young people's needs by incorporating technologies into education. Based on this premise, we have reviewed the prominent reports on the integration of ICT into education with the aim of showing how education is changing, and will change, to meet the needs of Millennials with ICT support. We conclude that most of the investments have simply resulted in an increase in computers and access to the Internet, with teachers reproducing traditional approaches to education and e-learning being seen as complementary to face-to-face education. While it would seem that the use of ICT is not revolutionizing learning, it is facilitating the personalization, collaboration and ubiquity of learning.
Abstract:
The Information Society has provided the context for the development of a new generation, known as the Millennials, who are characterized by their intensive use of technologies in everyday life. These features are changing the way of learning, prompting educational institutions to attempt to better adapt to young people's needs by incorporating technologies into education. Based on this premise, we have reviewed the prominent reports on the integration of ICT into education at different levels with the aim of showing how education is changing, and will change, to meet the needs of Millennials with ICT support. The results show that most of the investments have simply resulted in an increase in computers and access to the Internet, with teachers reproducing traditional approaches to education and e-learning being seen as complementary to face-to-face education. While it would seem that the use of ICT is not revolutionizing learning, it is facilitating the personalization, collaboration and ubiquity of learning.
Abstract:
The objective of this work is to implement a secure, intuitive and useful mobile-device tool that lets a student interact with the virtual educational environment created by the Campus de Trabajo UOC. The tool helps the student to work with the UOC without needing to connect directly from a laptop, from any location, provided there is an Internet connection. The TFG (final degree project) is focused on development for the Windows Phone environment.
Abstract:
The Iowa DOT has been using the AASHTO Present Serviceability Index (PSI) rating procedure since 1968 to rate the condition of pavement sections. A ride factor and a cracking and patching factor make up the PSI value. Crack and patch surveys have been done by sending crews out to measure and record the distress. Advances in video equipment and computers make it practical to videotape roads and do the crack and patch measurements in the office. The objective of the study was to determine the feasibility of converting the crack and patch survey operation to a video recording system with manual post processing. The summary and conclusions are as follows: Video crack and patch surveying is a feasible alternative to the current crack and patch procedure. The cost per mile should be about 25 percent less than the current procedure. More importantly, the risk of accidents is reduced by getting the people and vehicles off the roadway and shoulder. Another benefit is the elimination of the negative public perceptions of the survey crew on the shoulder.
Abstract:
Currently, individuals including designers, contractors, and owners learn about the project requirements by studying a combination of paper and electronic copies of the construction documents, including the drawings, specifications (standard and supplemental), road and bridge standard drawings, design criteria, contracts, addenda, and change orders. This can be a tedious process, since one needs to go back and forth between the various documents (paper or electronic) to obtain information about the entire project. Object-oriented computer-aided design (OO-CAD) is an innovative technology that can change this process through the graphical portrayal of information. OO-CAD allows users to point and click on portions of an object-oriented drawing that are then linked to relevant databases of information (e.g., specifications, procurement status, and shop drawings). The vision of this study is to turn paper-based design standards and construction specifications into an object-oriented design and specification (OODAS) system, or visual electronic reference library (ERL). Individuals can use the system through a handheld wireless book-size laptop that includes all of the necessary software for operating in a 3D environment. All parties involved in transportation projects can access all of the standards and requirements simultaneously using a 3D graphical interface. By using this system, users will have all of the design elements and all of the specifications readily available without concerns about omissions. A prototype object-oriented model was created and demonstrated to potential users representing counties, cities, and the state. Findings suggest that a system like this could improve the productivity of finding information by as much as 75% and provide a greater sense of confidence that all relevant information had been identified. It was also apparent that this system would be used by more people in construction than in design. There was also concern about the cost of developing and maintaining the complete system. The future direction should focus on a project-based system that can help contractors and DOT inspectors find information (e.g., road standards, specifications, instructional memorandums) more rapidly as it pertains to a specific project.
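As a rough sketch of the linking idea behind an OODAS/ERL system, the example below models a drawing object that points into specification and shop-drawing records; the class names, fields and document identifiers are hypothetical and are not taken from the prototype described above.

```python
from dataclasses import dataclass, field

@dataclass
class SpecReference:
    """A pointer from a drawing object into a reference-library document."""
    doc_type: str      # e.g. "standard specification", "road standard", "shop drawing"
    doc_id: str        # document identifier (hypothetical numbering)
    section: str = ""  # optional section or sheet within the document

@dataclass
class DrawingObject:
    """One clickable object in an object-oriented drawing."""
    object_id: str
    description: str
    links: list = field(default_factory=list)

    def lookup(self, doc_type):
        """Return all linked documents of a given type for this object."""
        return [r for r in self.links if r.doc_type == doc_type]

# Hypothetical example: a girder element linked to its governing documents
girder = DrawingObject(
    object_id="GIRDER-01",
    description="Prestressed concrete girder, span 2",
    links=[
        SpecReference("standard specification", "SS-2407", "Precast units"),
        SpecReference("shop drawing", "SD-114"),
    ],
)
print([r.doc_id for r in girder.lookup("standard specification")])
```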
Abstract:
OBJECTIVE: To test the effect of a multidimensional lifestyle intervention on aerobic fitness and adiposity in predominantly migrant preschool children. DESIGN: Cluster randomised controlled single blinded trial (Ballabeina study) over one school year; randomisation was performed after stratification for linguistic region. SETTING: 40 preschool classes in areas with a high migrant population in the German and French speaking regions of Switzerland. PARTICIPANTS: 652 of the 727 preschool children had informed consent and were present for baseline measures (mean age 5.1 years (SD 0.7), 72% migrants of multicultural origins). No children withdrew, but 26 moved away. INTERVENTION: The multidimensional culturally tailored lifestyle intervention included a physical activity programme, lessons on nutrition, media use (use of television and computers), and sleep, and adaptation of the built environment of the preschool class. It lasted from August 2008 to June 2009. MAIN OUTCOME MEASURES: Primary outcomes were aerobic fitness (20 m shuttle run test) and body mass index (BMI). Secondary outcomes included motor agility, balance, percentage body fat, waist circumference, physical activity, eating habits, media use, sleep, psychological health, and cognitive abilities. RESULTS: Compared with controls, children in the intervention group had an increase in aerobic fitness at the end of the intervention (adjusted mean difference 0.32 stages, 95% confidence interval 0.07 to 0.57; P=0.01) but no difference in BMI (-0.07 kg/m², -0.19 to 0.06; P=0.31). Relative to controls, children in the intervention group had beneficial effects in motor agility (-0.54 s, -0.90 to -0.17; P=0.004), percentage body fat (-1.1%, -2.0 to -0.2; P=0.02), and waist circumference (-1.0 cm, -1.6 to -0.4; P=0.001). There were also significant benefits in the intervention group in reported physical activity, media use, and eating habits, but not in the remaining secondary outcomes. CONCLUSIONS: A multidimensional intervention increased aerobic fitness and reduced body fat but not BMI in predominantly migrant preschool children.
Abstract:
This thesis is devoted to the analysis, modelling and visualisation of spatially referenced environmental data using machine learning algorithms. Machine learning can be considered, in a broad sense, a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can solve classification, regression and probability density modelling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to the geographical coordinates. Moreover, they are well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modelling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data including geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence, general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are treated both through the traditional geostatistical approach, with experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and allows the detection of spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the application of the k-nearest-neighbours method, which is very simple and has excellent interpretation and visualisation properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. General regression neural networks are proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, for which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office. This software collection has been developed over the last 15 years and has been used for teaching many courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a broad spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to create a user-friendly, easy-to-use interface.
Machine Learning for geospatial data: algorithms, software tools and case studies
Abstract:
The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence. It is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can find solutions to classification, regression, and probability density modelling problems in high-dimensional geo-feature spaces composed of the geographical space and additional relevant spatially referenced features. They are well suited to being implemented as predictive engines in decision support systems, for the purposes of environmental data mining, including pattern recognition, modelling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to understand the presence of spatial patterns, at least as described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a currently hot topic, namely the automatic mapping of geospatial data. General regression neural networks (GRNN) are proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for carrying out fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
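Since this abstract centres on GRNN for automatic mapping, here is a minimal sketch of the usual Nadaraya-Watson form of a general regression neural network applied to 2D spatial interpolation; the Gaussian kernel, single isotropic bandwidth and synthetic data are assumptions for illustration, and this is not the Machine Learning Office code.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma=0.1):
    """General regression neural network (Nadaraya-Watson kernel regression).

    Each training point contributes a Gaussian kernel; the prediction is the
    kernel-weighted average of the training values.
    train_xy: (n, 2) coordinates, train_z: (n,) values, query_xy: (m, 2).
    """
    # Squared Euclidean distances between every query point and training point
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))              # kernel weights, shape (m, n)
    return (w @ train_z) / np.maximum(w.sum(axis=1), 1e-12)

# Synthetic example: noisy samples of a smooth spatial field on the unit square
rng = np.random.default_rng(1)
xy = rng.uniform(0, 1, size=(200, 2))
z = np.sin(3 * xy[:, 0]) * np.cos(2 * xy[:, 1]) + 0.05 * rng.standard_normal(200)

# Predict on a regular grid (the "automatic mapping" step)
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_map = grnn_predict(xy, z, grid, sigma=0.08).reshape(gx.shape)
print(z_map.shape)                                    # (50, 50) interpolated map
```

In practice the bandwidth sigma would be tuned, for example by leave-one-out cross-validation, which is the usual way GRNN models are calibrated for mapping tasks.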
Abstract:
We have constructed a forward modelling code in Matlab capable of handling several commonly used electrical and electromagnetic methods in a 1D environment. We review the implemented electromagnetic field equations for grounded wires, frequency and transient soundings, and present new solutions for the case of a non-magnetic first layer. The CR1Dmod code evaluates the Hankel transforms occurring in the field equations using either the Fast Hankel Transform based on digital filter theory, or a numerical integration scheme applied between the zeros of the Bessel function. A graphical user interface allows easy construction of 1D models and control of the parameters. Modelling results agree with those of other authors, but the computation is less efficient than that of other available codes. Nevertheless, the CR1Dmod routine handles complex resistivities and offers solutions based on the full EM equations as well as the quasi-static approximation. Thus, modelling of effects based on changes in the magnetic permeability and the permittivity is also possible.
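To make the second evaluation strategy concrete, the sketch below (in Python rather than the Matlab of CR1Dmod) integrates a Hankel-transform integrand piecewise between consecutive zeros of the Bessel function and checks the result against a known closed form; the kernel, Bessel order and interval count are illustrative assumptions, not the CR1Dmod implementation.

```python
import numpy as np
from scipy.special import j0, jn_zeros
from scipy.integrate import quad

def hankel_j0(kernel, r, n_intervals=60):
    """Evaluate H(r) = int_0^inf kernel(lam) * J0(lam*r) * lam dlam
    by integrating piecewise between consecutive zeros of J0(lam*r)
    and summing the partial integrals (a simple, non-accelerated scheme).
    """
    zeros = jn_zeros(0, n_intervals) / r        # abscissas where J0(lam*r) = 0
    breakpoints = np.concatenate(([0.0], zeros))
    total = 0.0
    for a, b in zip(breakpoints[:-1], breakpoints[1:]):
        part, _ = quad(lambda lam: kernel(lam) * j0(lam * r) * lam, a, b)
        total += part
    return total

# Check against a known closed form: kernel exp(-a*lam) gives a / (a^2 + r^2)^1.5
a, r = 2.0, 1.5
numeric = hankel_j0(lambda lam: np.exp(-a * lam), r)
analytic = a / (a**2 + r**2) ** 1.5
print(numeric, analytic)    # should agree to several decimal places
```

The digital-filter Fast Hankel Transform alternative replaces this per-interval quadrature with a fixed convolution against precomputed filter coefficients, which is why it is typically the faster of the two options.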
Abstract:
Nanotechnology is becoming part of our daily life in a wide range of products such as computers, bicycles, sunscreens or nanomedicines. While these applications are already becoming reality, considerable work awaits the scientists, engineers, and policy makers who want such nanotechnological products to yield a maximum of benefit at a minimum of social, environmental, economic and (occupational) health cost. Considerable efforts for coordination and collaboration in research are needed if these goals are to be reached in a reasonable time frame and at an affordable price. This is recognized in Europe by the European Commission, which not only funds research projects but also supports the coordination of research efforts. One of these coordination efforts is NanoImpactNet, a researcher-operated network which started in 2008 to promote scientific cross-talk across all disciplines on the health and environmental impact of nanomaterials. Stakeholders contribute to these activities, notably the definition of research and knowledge needs. Initial discussions in this domain focused on finding agreement on common metrics and on which elements are needed for standardized approaches to hazard and exposure identification. There are many nanomaterial properties that may play a role. Hence, to gain the time needed to study this complex matter full of uncertainties, researchers and stakeholders unanimously called for simple, easy and fast risk assessment tools that can support decision making in this rapidly moving and growing domain. Today, several projects are starting or already running that will develop such assessment tools. At the same time, other projects investigate in depth which factors and material properties can lead to unwanted toxicity or exposure, what mechanisms are involved, and how such responses can be predicted and modelled. A vision for the future is that once these factors, properties and mechanisms are understood, they can and will be accounted for in the development of new products and production processes, following the idea of "Safety by Design". The promise of all these efforts is a future with nanomaterials in which most of their risks are recognized and addressed before they even reach the market.
Abstract:
This paper describes preliminary results of a qualitative case study on mobile communication conducted in an elders' retirement home in Toronto (Ontario, Canada) in May 2012. It is part of an international research project on the relationship between mobile communications and older people. Secondary data at the Canadian level contextualize the case study; we focus on demographic characteristics and on the adoption and use of information and communication technologies (ICTs) broken down by age. Participants in the study (21 individuals) are between 75 and 98 years of age, so we can consider that the gathered evidence refers to the 'old' older population. Mobile phone users in the sample describe very specific uses of the mobile phone, while non-users report not facing external pressures to adopt that technology. The main channel for mediated communication is the landline; in consequence, mobile phones, when used, constitute an extra layer of communication. Finally, when members of an individual's personal network live abroad, that individual is more prone to use the Internet and Skype. We were also able to find ex-users of both mobile telephony and computers/internet who stopped using these technologies because they did not find any use for them.
Abstract:
Financial information is extremely sensitive. Hence, electronic banking must provide a robust system to authenticate its customers and let them access their data remotely. On the other hand, such a system must be usable, affordable, and portable. We propose a challenge-response based one-time password (OTP) scheme that uses symmetric cryptography in combination with a hardware security module. The proposed protocol safeguards passwords from keyloggers and phishing attacks. Besides, this solution provides convenient mobility for users who want to bank online anytime and anywhere, not just from their own trusted computers.
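A minimal sketch of a challenge-response OTP of the general kind described, assuming an HMAC-SHA-256 keyed with a secret shared between the bank and the customer's hardware security module; the truncation scheme, code length and message format are illustrative assumptions rather than the paper's exact protocol.

```python
import hmac, hashlib, secrets

def issue_challenge() -> str:
    """Server side: generate a fresh random challenge (nonce) for this login."""
    return secrets.token_hex(8)

def compute_otp(shared_key: bytes, challenge: str, digits: int = 6) -> str:
    """Token side: derive a one-time password from the challenge.

    In a scheme like the one described, this computation would run inside a
    hardware security module so the shared key never touches the (possibly
    infected) PC used for online banking.
    """
    mac = hmac.new(shared_key, challenge.encode(), hashlib.sha256).digest()
    # Dynamic truncation, similar in spirit to HOTP (RFC 4226)
    offset = mac[-1] & 0x0F
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def verify(shared_key: bytes, challenge: str, response: str) -> bool:
    """Server side: recompute the expected OTP and compare in constant time."""
    return hmac.compare_digest(compute_otp(shared_key, challenge), response)

# Illustrative round trip
key = secrets.token_bytes(32)          # provisioned into the HSM and the bank
ch = issue_challenge()
otp = compute_otp(key, ch)
print(ch, otp, verify(key, ch, otp))   # the OTP is bound to this challenge only
```

Because each OTP is bound to a server-chosen challenge, a password captured by a keylogger or a phishing page is useless for any later session, which is the resistance property the abstract claims.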