857 results for Laptop computers


Abstract:

This thesis investigates how mobile technology can help bring information and communication technologies (ICT) to people in developing countries. Some people in developing countries have access to ICT while others do not; this digital divide is present in many developing countries where computers and the Internet are difficult to access. The Internet provides information that can increase productivity and enable markets to function more efficiently, reduces the time information takes to travel, and gives firms and workers more efficient ways to operate. ICT and the Internet can thus create opportunities for economic growth and productivity in developing countries, which makes it very important to bridge the digital divide and increase Internet connectivity there. The purpose of this thesis is to investigate how mobile technology and mobile services can help bridge the digital divide in developing countries. The theoretical background consists of a collection of articles and reports, gathered by reviewing the literature on the digital divide, mobile technology and mobile application development. The empirical research was conducted by emailing a questionnaire to a selection of application developers located in developing countries; its purpose was to gather qualitative information on mobile application development in those countries. The main result of this thesis suggests that mobile phones and mobile technology can help bridge the digital divide in developing countries, and that mobile technology is among the best tools for doing so. Mobile technology can bring affordable ICT to people who have no access to computers. Smartphones provide an Internet connection, mobile services and mobile applications to a rapidly growing number of mobile phone users in developing countries, and new low-cost smartphones empower people to access information through the Internet. Mobile technology therefore has the potential to help bridge the digital divide in developing countries, where a vast number of people own mobile phones.

Abstract:

Microsoft System Center Configuration Manager is a systems management product for managing large groups of computers and/or mobile devices. It provides operating system deployment, software distribution, patch management, hardware and software inventory, remote control and many other features for the managed clients. This thesis investigates whether the product is suitable for a large, international organization with no previous centralized solution for managing all such networked devices, and identifies areas where the system can be altered to achieve a more optimal management product from the company's perspective. The results showed that the system is suitable for such an organization, provided that it is properly configured and that a clear and transparent line of communication exists between key IT personnel.
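
A minimal sketch of interacting with such a managed client, assuming a Windows host with the Configuration Manager client installed: the client's WMI namespace (root\ccm) is queried through PowerShell for its version. The error handling is illustrative only.

```python
import json
import subprocess

def ccm_client_version():
    """Query the local Configuration Manager client (root\\ccm WMI
    namespace) for its version via PowerShell. Returns None when the
    client is not installed or the query fails."""
    cmd = [
        "powershell", "-NoProfile", "-Command",
        "Get-WmiObject -Namespace root\\ccm -Class SMS_Client "
        "| Select-Object ClientVersion | ConvertTo-Json",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0 or not result.stdout.strip():
        return None
    return json.loads(result.stdout)["ClientVersion"]

if __name__ == "__main__":
    print("CM client version:", ccm_client_version())
```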

Abstract:

Wind energy has attracted great expectations due to the risks of global warming and of nuclear power plant accidents. Nowadays, wind farms are often constructed in areas of complex terrain, and a potential wind farm site must be thoroughly surveyed and its wind climatology analyzed before any hardware is installed. Modeling of atmospheric boundary layer (ABL) flows over complex terrains containing, e.g., hills, forests and lakes is therefore of great interest in wind energy applications, as it can help in locating and optimizing wind farms. Numerical modeling of wind flows using computational fluid dynamics (CFD) has become a popular technique during the last few decades. Due to the inherent flow variability and large-scale unsteadiness typical of ABL flows in general, and especially over complex terrains, the flow can be difficult to predict accurately enough using the Reynolds-averaged Navier-Stokes (RANS) equations. Large-eddy simulation (LES) resolves the largest and thus most important turbulent eddies and models only the small-scale motions, which are more universal than the large eddies and thus easier to model. LES is therefore expected to be more suitable for this kind of simulation, although it is computationally more expensive than the RANS approach. With the fast development of computers and open-source CFD software in recent years, the application of LES to atmospheric flows is becoming increasingly common. The aim of this work is to simulate atmospheric flows over realistic and complex terrains by means of LES, with the evaluation of potential inland wind park locations as the main application. The thesis reports the development of an LES methodology for simulating atmospheric flows over realistic terrains and also aims at validating this methodology at real scale. LES are carried out for flow problems ranging from basic channel flows to real atmospheric flows over one of the most recent real-life complex-terrain test cases, the Bolund hill. All the simulations reported in the thesis are carried out using a new OpenFOAM®-based LES solver, which uses a 4th-order time-accurate Runge-Kutta scheme and a fractional step method. The development of the LES methodology pays special attention to two boundary conditions: the upstream (inflow) and wall boundary conditions. The upstream boundary condition is generated using the so-called recycling technique, in which the instantaneous flow properties are sampled on a plane downstream of the inlet and mapped back to the inlet at each time step. This technique develops the upstream boundary-layer flow together with the inflow turbulence without any precursor simulation, and thus within a single computational domain. The roughness of the terrain surface is modeled by a new wall function implemented into OpenFOAM® during the thesis work. Both the recycling method and the newly implemented wall function are validated in channel flows at relatively high Reynolds numbers before being applied to atmospheric flows. After validating the LES model on simple flows, simulations are carried out for atmospheric boundary-layer flows over two types of hills: first, two-dimensional wind-tunnel hill profiles, and second, the Bolund hill located in Roskilde Fjord, Denmark. For the two-dimensional wind-tunnel hills, the study focuses on the overall flow behavior as a function of the hill slope. Moreover, the simulations are repeated using another wall function suitable for smooth surfaces, which already existed in OpenFOAM®, in order to study the sensitivity of the flow to surface roughness in ABL flows. The simulated results obtained with the two wall functions are compared against the wind-tunnel measurements. It is shown that LES using the implemented wall function produces overall satisfactory results for the turbulent flow over the two-dimensional hills; the predictions of the flow separation and the reattachment length for the steeper hill are closer to the measurements than in other numerical studies reported in the past for the same hill geometry. The field measurement campaign performed over the Bolund hill provides the most recent field-experiment dataset for the mean flow and turbulence properties, and a number of research groups have simulated the wind flow over the hill. Due to its challenging features, such as an almost vertical slope, the Bolund hill is considered an ideal experimental test case for validating micro-scale CFD models for wind energy applications. In this work, the simulated results for two wind directions are compared against the field measurements. It is shown that the present LES can reproduce the complex turbulent wind flow structures over a complicated terrain such as the Bolund hill. In particular, the present LES results show the best prediction of the turbulent kinetic energy, with an average error of 24.1%, which is 43% smaller than any other model results reported in the past for the Bolund case. Finally, the validated LES methodology is demonstrated by simulating the wind flow over the existing Muukko wind farm located in south-eastern Finland. The simulation is carried out for one wind direction only, and the results for the instantaneous and time-averaged wind speeds are briefly reported. The demonstration case is followed by a discussion of the practical aspects of LES for wind resource assessment over a realistic inland wind farm.
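
A minimal sketch of the recycling inflow technique described above, assuming a structured velocity array with a spanwise averaging direction; the index layout is illustrative, not the solver's actual implementation:

```python
import numpy as np

def recycle_inflow(u, inlet_ix, recycle_ix, u_target_mean):
    """One recycling step: sample a downstream plane, keep its turbulent
    fluctuations, and superpose them on the prescribed mean inflow.

    u             -- velocity field, shape (nx, ny, nz, 3); x streamwise
    inlet_ix      -- streamwise index of the inlet plane
    recycle_ix    -- streamwise index of the sampling plane
    u_target_mean -- prescribed mean inflow profile, shape (nz, 3)
    """
    plane = u[recycle_ix]                        # shape (ny, nz, 3)
    mean = plane.mean(axis=0, keepdims=True)     # spanwise average
    fluct = plane - mean                         # turbulent fluctuations
    u[inlet_ix] = u_target_mean[np.newaxis] + fluct
    return u
```

Called once per time step, such a mapping develops the boundary-layer turbulence within a single computational domain, with no precursor simulation.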

Abstract:

This study examined how good an artificial intelligence (AI) for a computer game can be implemented with current knowledge and technology. AI was restricted here to mean AI-controlled game characters, and trivial AI implementations were excluded. The work was carried out by reviewing the literature on the subject and the material on developer-community web sites. Entertainment value and believability were chosen as the criteria for good AI. A survey of the most popular implementation techniques and the possibilities of AI showed that, in theory, even a highly advanced AI can be implemented. In practice, however, the limited resources of the computer, the limited skills of developers and the requirements set by game development projects appear to restrict the implementation of AI in a commercial product.
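
The abstract does not name the techniques surveyed; as an illustration of one of the most common building blocks for AI-controlled game characters, a minimal finite-state machine might look as follows. The guard character, states and thresholds are invented for the example.

```python
class GuardAI:
    """Minimal finite-state machine for a hypothetical guard character:
    patrol until the player is seen, chase until the player escapes."""

    def __init__(self, sight_range=10.0):
        self.state = "patrol"
        self.sight_range = sight_range

    def update(self, distance_to_player):
        if self.state == "patrol":
            if distance_to_player < self.sight_range:
                self.state = "chase"              # player spotted
        elif self.state == "chase":
            if distance_to_player >= 2 * self.sight_range:
                self.state = "patrol"             # player escaped
        return self.state

guard = GuardAI()
for d in (15.0, 8.0, 12.0, 25.0):
    print(d, "->", guard.update(d))
```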

Abstract:

The circadian timing system is structured so as to receive information from the external and internal environments, and its function is to organize physiological and behavioral processes in a circadian pattern. In mammals, the circadian timing system consists of a group of structures including the suprachiasmatic nucleus (SCN), the intergeniculate leaflet and the pineal gland. Neuron groups working as a biological pacemaker are found in the SCN, forming a biological master clock. We present here a simple model for the circadian timing system of mammals, which is able to reproduce two fundamental characteristics of biological rhythms: the endogenous generation of pulses and synchronization with the light-dark cycle. In this model, the biological pacemaker of the SCN was modeled as a set of 1000 homogeneously distributed coupled oscillators with long-range coupling forming a spherical lattice. The characteristics of the oscillator set were defined on the basis of Kuramoto's oscillator dynamics, but we used a new method for estimating the equilibrium order parameter. The simultaneous activities of the excitatory and inhibitory synapses on the elements of the circadian timing circuit at each instant were modeled by specific equations for synaptic events. All simulation programs were written in Fortran 77, and compiled and run on PC DOS computers. Our model exhibited responses in agreement with physiological patterns: the output frequencies of the oscillator system (maximal value 3.9 Hz) were of the same order of magnitude as the firing frequencies recorded in suprachiasmatic neurons of rodents in vivo and in vitro (from 1.8 to 5.4 Hz).
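
A minimal sketch of the Kuramoto dynamics underlying such a pacemaker model, using the standard mean-field order parameter r; the authors' own method for estimating the equilibrium order parameter is not reproduced here, and the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 1000, 2.0, 0.01, 5000

theta = rng.uniform(0, 2 * np.pi, N)    # oscillator phases
omega = rng.normal(0.0, 1.0, N)         # natural frequencies

for _ in range(steps):
    # Mean-field form of the Kuramoto coupling term:
    # dtheta_i/dt = omega_i + K * r * sin(psi - theta_i)
    z = np.exp(1j * theta).mean()        # z = r * exp(i * psi)
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + K * r * np.sin(psi - theta))

print("order parameter r =", round(r, 3))  # r near 1 means synchronization
```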

Abstract:

This bachelor's thesis gives a comprehensive overview of the various methods for setting up multiple displays on PC hardware; several such methods exist, differing in their features and intended uses. The thesis examines the history of multi-display support in Windows and its development across Windows versions, from the beginnings of the support in the 1990s up to the most recent Windows operating systems. Finally, multi-display support in games is examined, including how to make use of multiple displays in games that do not support them natively.
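
As a small illustration of how an application can discover the attached displays on Windows, the following sketch calls the Win32 EnumDisplayMonitors API through ctypes (Windows only; it collects just the monitor rectangles):

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Callback type expected by EnumDisplayMonitors.
MonitorEnumProc = ctypes.WINFUNCTYPE(
    wintypes.BOOL, wintypes.HMONITOR, wintypes.HDC,
    ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

monitors = []

def _collect(hmonitor, hdc, rect_ptr, lparam):
    r = rect_ptr.contents
    monitors.append((r.left, r.top, r.right, r.bottom))
    return True  # continue enumeration

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(_collect), 0)
print(len(monitors), "display(s):", monitors)
```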

Abstract:

Personalized medicine will revolutionize our capabilities to combat disease. Working toward this goal, a fundamental task is the deciphering of genetic variants that are predictive of complex diseases. Modern studies, in the form of genome-wide association studies (GWAS), have afforded researchers the opportunity to reveal new genotype-phenotype relationships through the extensive scanning of genetic variants. These studies typically contain over half a million genetic features for thousands of individuals. Examining these data with methods other than univariate statistics is a challenging task requiring advanced algorithms that are scalable to the genome-wide level. In the future, next-generation sequencing (NGS) studies will contain an even larger number of common and rare variants. Machine learning-based feature selection algorithms have been shown to be able to effectively create predictive models for various genotype-phenotype relationships. This work explores the problem of selecting the genetic variant subsets that are most predictive of complex disease phenotypes through various feature selection methodologies, including filter, wrapper and embedded algorithms. The examined machine learning algorithms were demonstrated not only to be effective at predicting disease phenotypes, but also to do so efficiently through the use of computational shortcuts. While much of the work could be run on high-end desktops, some of it was further extended so that it could be implemented on parallel computers, helping to ensure that the methods will also scale to NGS data sets. Furthermore, these studies analyzed the relationships between the various feature selection methods and demonstrated the need for careful testing when selecting an algorithm. It was shown that there is no universally optimal algorithm for variant selection in GWAS; rather, methodologies need to be selected based on the desired outcome, such as the number of features to be included in the prediction model. It was also demonstrated that without proper model validation, for example using nested cross-validation, models can yield overly optimistic prediction accuracies and decreased generalization ability. It is through the implementation and application of machine learning methods that one can extract predictive genotype-phenotype relationships and biological insights from genetic data sets.
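
A minimal sketch of the nested cross-validation mentioned above, using scikit-learn with synthetic data and an L1-penalized model as a stand-in embedded feature selector; this is a generic illustration, not the thesis's actual pipeline:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

# Synthetic stand-in for a variant matrix: many features, few informative.
X, y = make_classification(n_samples=300, n_features=2000,
                           n_informative=20, random_state=0)

# Inner loop: tune the sparsity level of the embedded feature selector.
inner = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000),
    param_grid={"C": [0.01, 0.1, 1.0]},
    cv=3)

# Outer loop: estimate generalization of the whole tuned pipeline,
# so feature selection never sees the held-out fold.
outer_scores = cross_val_score(inner, X, y, cv=5)
print("nested CV accuracy: %.3f +/- %.3f"
      % (outer_scores.mean(), outer_scores.std()))
```

Tuning and evaluating on the same folds would report the overly optimistic accuracies the abstract warns about.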

Abstract:

Digitalization has been predicted to change the future, as a growing range of non-routine tasks will be automated, offering new kinds of business models for enterprises. Service-oriented architecture (SOA) provides a basis for designing and implementing well-defined problems as reusable services, allowing computers to execute them. Service-oriented design has the potential to act as a mediator between IT and human resources, but enterprises struggle with their SOA adoption and lack a linkage between the benefits and costs of services. This thesis studies the phenomenon of service reuse in enterprises, proposing an ontology that conceptually links different kinds of services with their role as a part of the business model. The proposed ontology has been created on the basis of qualitative research conducted in three large enterprises. Service reuse has two roles in enterprises: it enables automated data sharing among human and IT resources, and it may provide cost savings in service development and operations. From a technical viewpoint, the ability to define a business problem as a service is one of the key enablers for achieving service reuse. The research proposes two service identification methods: the first identifies prospective services in the existing documentation of the enterprise, and the second models the services from a functional viewpoint, supporting service identification sessions with business stakeholders.
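
A minimal sketch of defining a well-defined business problem as a reusable service, using only the Python standard library; the VAT-calculation problem, the rate and the endpoint are invented for the example:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def vat(amount: float, rate: float = 0.24) -> float:
    """The well-defined business problem: compute VAT for an amount."""
    return round(amount * rate, 2)

class VatService(BaseHTTPRequestHandler):
    """Exposes the problem as a reusable service any client can call,
    e.g.: curl -X POST localhost:8000 -d '{"amount": 100.0}'"""

    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        req = json.loads(body)
        resp = json.dumps({"vat": vat(req["amount"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(resp)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), VatService).serve_forever()
```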

Abstract:

Many, if not all, aspects of our everyday lives are related to computers and control; microprocessors and wireless communications are deeply involved in our lives. Embedded systems are an attractive field because they combine three key factors: small size, low power consumption and high computing capability. The aim of this thesis is to study how Linux communicates with the hardware, to answer the question of whether an operating system like Debian can be used for embedded systems, and finally to build a mechatronic real-time application. The thesis presents Linux and the Xenomai real-time patch, and analyzes the bootloader and communication with the hardware. The BeagleBone evaluation board is presented, along with the application project: a robot cart with a driver circuit, a line sensor reading a black line and two XBee antennas, making use of Xenomai threads under the real-time kernel. According to the obtained results, Linux is able to operate as a real-time operating system. Directions for future research in the area of embedded Linux are also discussed.
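
Xenomai's task API is C; purely to illustrate the periodic-task pattern such an application is built on, a sketch in plain Python (which itself offers no real-time guarantees) might be:

```python
import time

PERIOD_S = 0.01  # 10 ms control period, invented for the example

def control_step():
    """Placeholder for reading the line sensor and driving the motors."""
    pass

def periodic_loop(n_steps=200):
    next_release = time.monotonic()
    for _ in range(n_steps):
        control_step()
        next_release += PERIOD_S             # absolute deadlines avoid drift
        delay = next_release - time.monotonic()
        if delay > 0:
            time.sleep(delay)                # a real-time kernel makes this
                                             # wakeup deterministic
periodic_loop()
```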

Abstract:

Postgraduate seminar series titled Situational Awareness for Critical Infrastructure Protection, held at the Department of Military Technology of the National Defence University in 2015. This book is a collection of some of the talks presented in the seminar. The papers address the design of inter-organizational situation awareness systems, principles of designing for situation awareness, situation awareness in distributed teams, vulnerability analysis in a critical system context, tactical Command, Control, Communications, Computers & Intelligence (C4I) systems, and improving situational awareness in the circle of trust. This set of papers aims to give some insight into current issues in situation awareness for critical infrastructure protection. The seminar has always published its papers, but only as internal publications of the Finnish Defence Forces, which has not hindered their publication in international conferences. Publication of these papers in peer-reviewed conferences has indeed always been a goal of the seminar, since it teaches the writing of conference-level papers. We nevertheless hope that an internal publication in the department's series is useful to the Finnish Defence Forces by offering easy access to these papers.

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. The problem is how to define secure software and how to measure its security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. The thesis takes the perspective of software engineers in the software design phase; code-level semantics and programming language specifics are therefore not discussed, and organizational policy, management issues and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied as a case study. The target of the case study was a Java-based email server called Apache James, whose changelog and security issue details were available and whose source code was accessible. The research revealed that there is a consensus in the terminology of software security. Security verification activities are commonly divided into evaluation and assurance; the focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good: only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of software, but in practice they were limited to comparing different versions of the same software. Apart from being only relative, the metrics were unable to detect security issues or point out problems in the design, and interpreting their results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired: the metrics studied had both theoretical and practical issues, and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out the areas where security metrics need to improve if verification of security from the design phase onward is desired.

Abstract:

Since its discovery, chaos has been a very interesting and challenging topic of research, and many great minds have spent their entire lives trying to give rules to it. Nowadays, thanks to the research of the last century and the advent of computers, it is possible to predict chaotic phenomena of nature for a certain limited amount of time. The aim of this study is to present a recently discovered method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, to give some hints for a more optimized use of it, and to outline a possible industrial application. The main part of the study concerns two chaotic attractors whose general behaviour differs, in order to capture eventual differences in the results. In the various simulations performed, the initial conditions were varied in a quite exhaustive way. The results obtained show that, under certain conditions, the method works very well in all the cases. In particular, it emerged that the most important aspect is to be very careful while creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low-quality results.
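
The correlation integral at the heart of such a likelihood can be estimated from a trajectory as the fraction of point pairs closer than a radius r. A minimal sketch, using a delay-embedded logistic map purely as a convenient chaotic example:

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_sum(points, radii):
    """Fraction of distinct point pairs within distance r, for each r."""
    d = pdist(points)                        # all pairwise distances
    return np.array([(d < r).mean() for r in radii])

# Chaotic test data: logistic map x -> 4x(1-x), delay-embedded in 2-D.
x = np.empty(2000)
x[0] = 0.1
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
points = np.column_stack([x[:-1], x[1:]])

radii = np.logspace(-3, 0, 10)
print(correlation_sum(points, radii))
```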

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual, and possibly inaccurate, methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem, the goal is to determine the distances from given points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure in different areas. In the second problem, the input consists of a set of sites where water quality observations have been made, together with the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
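
The geometric core of the fetch length problem is a ray-shoreline intersection: from a study point, cast a ray in a given direction and take the nearest hit on any shoreline segment. A brute-force sketch (the thesis's algorithms are far more efficient than this linear scan over all segments):

```python
import math

def ray_segment_t(px, py, dx, dy, ax, ay, bx, by):
    """Distance along the unit ray (p, d) to segment a-b, or None."""
    vx, vy = bx - ax, by - ay
    denom = dx * vy - dy * vx           # cross(d, v)
    if abs(denom) < 1e-12:              # ray parallel to segment
        return None
    wx, wy = ax - px, ay - py
    t = (wx * vy - wy * vx) / denom     # position along the ray
    s = (wx * dy - wy * dx) / denom     # position along the segment
    if t >= 0.0 and 0.0 <= s <= 1.0:
        return t
    return None

def fetch_length(point, direction_deg, shoreline_segments):
    """Nearest shoreline hit from `point` in the given direction."""
    dx = math.cos(math.radians(direction_deg))
    dy = math.sin(math.radians(direction_deg))
    hits = (ray_segment_t(*point, dx, dy, *a, *b)
            for a, b in shoreline_segments)
    return min((t for t in hits if t is not None), default=math.inf)

# Example: one study point, one shoreline segment 5 units to the east.
print(fetch_length((0.0, 0.0), 0.0, [((5.0, -1.0), (5.0, 1.0))]))  # 5.0
```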

Abstract:

Modern automobiles are no longer just mechanical tools. The electronics and computing services they ship with make them nothing less than computers: massive kinetic devices with sophisticated computing power. Most modern vehicles are designed with added connectivity in mind, which may leave them vulnerable to outside attack. Researchers have shown that it is possible to infiltrate a vehicle's internal system remotely and control physical entities such as the steering and brakes; it is thus quite possible to experience such an attack in a moving vehicle and be unable to use the controls. Because these massive connected computers are part of everyday life, such attacks can be life-threatening. The first part of this research studied the attack surfaces in the automotive cybersecurity domain and illustrated the attack methods and the extent of the damage they can cause. An online survey was deployed as a data collection tool to learn about consumers' usage of such vulnerable automotive services. The second part of the research examined consumers' privacy in the automotive world. It was found that almost one hundred percent of modern vehicles have the capability to send vehicle diagnostic data as well as user-generated data to their manufacturers, and almost thirty-five percent of automotive companies are already collecting them. Internet privacy has been studied before in many related domains, but no privacy scale had been matched to automotive consumers; this created the research gap and the motivation for this thesis. A study was performed to match the well-established consumer privacy scale IUIPC (Internet Users' Information Privacy Concerns) to the automotive consumers' privacy situation. Hypotheses were developed based on the IUIPC model for Internet consumers' privacy and tested against the findings from the data collection. Based on the key findings of the research, all the hypotheses were accepted, and hence automotive consumers' privacy was found to follow the IUIPC model under certain conditions. It was also found that a majority of automotive consumers use services and devices that are vulnerable and prone to cyber-attacks, and it was established that there is a market for automotive cybersecurity services, with consumers willing to pay certain fees to obtain them.

Abstract:

This work was a literature study of the current state and the markets of battery technology in consumer electronics, together with a review of potential future battery technologies. It was found that the only battery types used on a large scale in consumer electronics are nickel-metal hydride (NiMH) and lithium-ion (Li-ion) batteries. Capacity is usually considered the most important property of batteries in consumer electronics, and there Li-ion batteries are clearly superior, with up to twice the energy density. Depending on the cathode material used, Li-ion batteries can also achieve a service life many times longer in charge cycles and a discharge current many times higher. The advantages of NiMH batteries are mainly a lower price and better safety. Their low voltage can also be counted as an advantage, because it allows NiMH batteries to replace disposable alkaline batteries. In 2012, measured in capacity, up to eight times more Li-ion batteries were sold than NiMH batteries, and sales volumes are predicted to keep growing. Most of the Li-ion battery sales went to consumer electronics applications, and as much as two thirds were batteries for laptop computers and mobile phones. Many new battery technologies and improvements to Li-ion batteries are under development, but the greatest potential, along with great obstacles to commercialization, belongs to lithium-air batteries. In the shorter term, promising technologies include lithium-sulfur batteries as well as new anode materials for current Li-ion batteries, such as silicon and aluminium/titanium, for whose problems solutions have been found in nanotechnology.