893 results for OpenFlow, SDN, Software-Defined Networking, Cloud


Relevance: 20.00%

Abstract:

Past research has suggested that social engineering poses the most significant security risk, and recent studies indicate that social networking sites (SNSs) are the most common source of social engineering attacks. The risk of social engineering attacks in SNSs is associated with the difficulty of making accurate judgments regarding source credibility in the virtual environment of SNSs. In this paper, we quantitatively investigate source credibility dimensions in terms of social engineering on Facebook, as well as the source characteristics that lead Facebook users to judge an attacker as credible and therefore make them susceptible to victimization. Moreover, in order to predict users’ susceptibility to social engineering victimization from their demographics, we investigate the effect of source characteristics on different demographic groups by measuring users’ consent intentions and behavioral responses to social engineering requests in a role-play experiment.

Relevance: 20.00%

Abstract:

We report here on a series of laboratory experiments on plumes, undertaken with the object of simulating the effect of the heat release that occurs in clouds on condensation of water vapor. The experimental technique used for this purpose relies on ohmic heating generated in an electrically conducting plume fluid subjected to a suitable alternating voltage across specified axial stations in the plume flow [Bhat et al., 1989]. The present series of experiments achieves a value of the Richardson number that is toward the lower end of the range that characterizes cumulus clouds. It is found that the buoyancy enhancement due to heating disrupts the eddy structures in the flow and reduces the dilution owing to entrainment of ambient fluid that would otherwise have occurred in the central region of the plume. Heating also reduces the spread rate of the plume, but as it accelerates the flow as well, the overall specific mass flux in the plume does not show a very significant change at the heat input employed in the experiment. However, there is some indication that the entrainment rate (proportional to the streamwise derivative of the mass flux) is slightly higher immediately after heat injection and slightly lower farther downstream. The measurements support a previous proposal for a cloud scenario [Bhat and Narasimha, 1996] and demonstrate how fresh insights into certain aspects of the fluid dynamics of clouds may be derived from the experimental techniques employed here.
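
A compact statement of the quantities involved, as a sketch in standard plume notation (the symbols below are our illustrative choices, not the paper's):

\[
E(z) \propto \frac{d\dot{m}}{dz},
\qquad
Ri = \frac{g\,(\Delta\rho/\rho_a)\,b}{w^2},
\]

where \(\dot{m}\) is the specific mass flux at streamwise station \(z\), \(b\) a local plume width, \(w\) a characteristic velocity, and \(\Delta\rho/\rho_a\) the relative density deficit. Volumetric heating adds to the buoyancy in the numerator, which is how the experiments drive \(Ri\) toward the lower end of the cumulus-cloud range.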

Relevance: 20.00%

Abstract:

Bug fixing is a highly cooperative work activity in which developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker 1) supported the information needs of different stakeholders, 2) established common ground, and 3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design based on these findings.

Relevance: 20.00%

Abstract:

Aerosols from biomass burning can alter the radiative balance of the Earth by reflecting and absorbing solar radiation(1). Whether aerosols exert a net cooling or a net warming effect will depend on the aerosol type and the albedo of the underlying surface(2). Here, we use a satellite-based approach to quantify the direct, top-of-atmosphere radiative effect of aerosol layers advected over the partly cloudy boundary layer of the southeastern Atlantic Ocean during July-October of 2006 and 2007. We show that the warming effect of aerosols increases with underlying cloud coverage. This relationship is nearly linear, making it possible to define a critical cloud fraction at which the aerosols switch from exerting a net cooling to a net warming effect. For this region and time period, the critical cloud fraction is about 0.4, and is strongly sensitive to the amount of solar radiation the aerosols absorb and the albedo of the underlying clouds. We estimate that the regional-mean warming effect of aerosols is three times higher when large-scale spatial covariation between cloud cover and aerosols is taken into account. These results demonstrate the importance of cloud prediction for the accurate quantification of aerosol direct effects.
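
The near-linear dependence admits a simple worked form; assuming the top-of-atmosphere direct effect interpolates linearly between its clear-sky and overcast values (the notation is ours, not the paper's):

\[
\Delta F(f) \approx (1-f)\,\Delta F_{\text{clear}} + f\,\Delta F_{\text{cloud}},
\qquad
f_c = \frac{-\Delta F_{\text{clear}}}{\Delta F_{\text{cloud}} - \Delta F_{\text{clear}}},
\]

so with \(\Delta F_{\text{clear}} < 0\) (cooling over the dark ocean surface) and \(\Delta F_{\text{cloud}} > 0\) (warming over bright cloud), the net effect changes sign at the critical cloud fraction \(f_c\), found above to be about 0.4 for this region and period.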

Relevance: 20.00%

Abstract:

This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, viewed as a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
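
As one concrete instance of the reliability-growth models surveyed in the paper, the Goel-Okumoto NHPP model (our choice of example; the paper covers a family of such models) gives the expected cumulative number of failures \(m(t)\) and the failure intensity \(\lambda(t)\) as

\[
m(t) = a\left(1 - e^{-bt}\right),
\qquad
\lambda(t) = ab\,e^{-bt},
\]

where \(a\) is the expected total number of faults and \(b\) the per-fault detection rate; the expected number of errors remaining after testing to time \(T\) is then \(a - m(T) = a\,e^{-bT}\), linking reliability growth in development to residual-error estimation in operation.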

Relevance: 20.00%

Abstract:

First year medical laboratory science students (up to 120) undertake a group e-poster project based on a blended learning model. Google Drive, encompassing Google’s cloud computing software, provides a readily accessible, transparent online space for students to collaborate with each other and realise tangible outcomes from their learning. The Cube provides an inspiring digital learning display space for student ‘conference style’ presentations.

Relevance: 20.00%

Abstract:

The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realised by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
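
A minimal sketch of the filtered-noise scheme just described, in Python; the Butterworth band-pass filters, the sampling rate and the band weights below are our assumptions standing in for Zetterberg's rational transfer functions:

import numpy as np
from scipy import signal

fs = 200.0                        # sampling rate in Hz (our choice)
t = np.arange(0, 25.0, 1.0 / fs)  # a 25 s record, as in the abstract
rng = np.random.default_rng(42)   # fixed seed: same parameters, same statistical output

# Band edges (Hz) and relative power weights for delta, alpha and beta (weights are illustrative).
bands = {"delta": (0.5, 4.0, 1.0), "alpha": (8.0, 13.0, 2.0), "beta": (13.0, 30.0, 0.5)}

eeg = np.zeros_like(t)
for lo, hi, weight in bands.values():
    noise = rng.standard_normal(t.size)                          # white-noise source
    b, a = signal.butter(4, [lo, hi], btype="bandpass", fs=fs)   # rational transfer function
    band = signal.lfilter(b, a, noise)                           # filtered output
    eeg += weight * band / band.std()                            # scale band power, then sum

# 'eeg' is the summed 'stationary EEG'; rerunning with the same seed and
# weights reproduces the same statistical output.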

Relevance: 20.00%

Abstract:

This thesis examines the professionalism of Finnish television subtitlers, their translation process, and the effects of digital subtitling software on the subtitling process, from the perspective of professional subtitlers. The digitalization of Finnish television has caused upheavals in the subtitling field as well, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theoretical part discusses translation and subtitling research and training in Finland, professional competence and professionalism, and translation aids. Subtitling is presented as a specialized form of translation; it should also be noted, however, that translation is only one stage in the subtitling process. The theoretical part concludes with a discussion of the everyday work and current professional field of Finnish television subtitlers: subtitlers work under a wide variety of terms of employment, and quality criteria may have to be re-evaluated. The empirical part opens by noting that surprisingly few Finnish television subtitlers have been interviewed and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unstudied; the Finnish subtitling process in particular offers material for research. The subjects of the study are translators who produce television subtitles for a living. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specializing in subtitling; through both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents have a neutral or even negative view of their profession; what these subtitlers have in common is that all have less than five years of experience in the field. The majority of respondents, however, are proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview stage, a translation stage, a timing stage and a correction-viewing stage. Among other things, the subtitlers were asked to estimate the total duration of their subtitling process; large differences emerged, at least some of which correlate with experience. Just over half of the respondents have acquired digital subtitling software of their own, while some still do the timing at the translation agency, partly because of the high price of the software. Digital software has changed the subtitling process and working practices, as video recorders and television sets have given way to working on a computer alone. It is now possible to work remotely from distant locations, to alternate between translating and timing, or to pre-time first and then translate. Digital technology has thus enabled changes in the subtitling process and alternative working methods, but not all of these methods necessarily benefit the subtitler. The traditional subtitling process (preview, marking subtitle divisions in the script, translating and drafting the subtitles, corrections and a final check viewing) still appears to be the most efficient. Although working practices differ, the overall picture is that, after the initial stumbles of digitalization, subtitlers' work has become more efficient.

Relevance: 20.00%

Abstract:

Twelve strains of Pseudomonas pseudomallei were isolated from the soil and water of a sheep paddock over a two-year period. The organism was recovered from the clay layer of the soil profile as well as from water that seeps into this layer during the "wet" season. Five isolates were obtained before the commencement of the "wet" season; environmental factors appear to play an important role in the survival of Ps. pseudomallei during the "dry" season. Lower isolation rates were recorded than those indicated by workers in southeast Asia and Iran.

Relevance: 20.00%

Abstract:

The Fraunhofer diffraction analysis of cloud-covered satellite imagery has shown that the diffraction pattern follows an approximately cosine-squared distribution. The overshooting tops of clouds and the shadows they cast contribute much to the diffraction of light, particularly in the high-frequency range. Indeed, cloud-covered imagery can be distinguished from cloud-free imagery on the basis of the rate of decay of the diffracted light power in the high-frequency band.
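
A rough sketch of how the decay-rate criterion might be computed digitally; here a 2D FFT stands in for the optical Fraunhofer diffraction, and the radial band limits and log-log fit are our illustrative choices:

import numpy as np

def highfreq_decay_rate(image):
    """Slope of log power versus log radial frequency in the upper band.

    A slower decay (less negative slope) would point to cloud cover, since
    overshooting tops and their shadows boost high-frequency power.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = spectrum.shape
    y, x = np.indices(spectrum.shape)
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)  # radial frequency index
    power = (np.bincount(r.ravel(), weights=spectrum.ravel())
             / np.bincount(r.ravel()))                  # radially averaged power
    nyq = min(nx, ny) // 2
    band = np.arange(nyq // 2, nyq)                     # assumed "high-frequency" band
    slope, _ = np.polyfit(np.log(band), np.log(power[band]), 1)
    return slope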

Relevance: 20.00%

Abstract:

Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and to help with quality control, there is a strong need for a software system that can help with the tracking of samples and the capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow.

Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. It leads the user through each step of the process, from starting an experiment to storing the output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and that all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace any genotypic data back to the electrophoresis gel files or sample source, and to repeat experiments. The LIMS is presently being used to capture high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics.

Conclusions: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping laboratory. The application, with source code, is freely available for academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
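
The "auto-binning of alleles" step amounts to grouping raw fragment sizes onto a shared grid of bins; a minimal sketch of one common approach (clustering sizes that fall within a tolerance of the running bin mean), offered as our illustration rather than the LIMS's actual algorithm:

def autobin_alleles(sizes, tol=0.5):
    """Group raw fragment sizes (in base pairs) into allele bins.

    A size within `tol` bp of the current bin mean joins that bin; otherwise
    it starts a new bin. Returns a list of (bin_mean, member_sizes) pairs.
    """
    bins = []
    for size in sorted(sizes):
        if bins and size - bins[-1][0] <= tol:
            mean, members = bins[-1]
            members.append(size)
            bins[-1] = (sum(members) / len(members), members)
        else:
            bins.append((size, [size]))
    return bins

# Example: reads clustering around two alleles at roughly 172 bp and 176 bp.
print(autobin_alleles([171.8, 172.1, 172.3, 175.9, 176.2]))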

Relevance: 20.00%

Abstract:

This study examines leadership skills in a municipal organisation, reflecting managers' views on the skills required. The purpose of the study was to identify the leadership skills that are most important now and in the future, as well as how well those skills are mastered. The study also examines the perceived importance of change and the development needs in leadership skills. In addition, the effect of background variables on the evaluation of leadership skills was examined. A quantitative research method was used: the material was collected with a structured questionnaire from 324 Kotka city managers, and the SPSS program was used to analyse it. Factor analysis was the main method of analysis; in addition, means and standard deviations were used to better reflect the results. Based on the results, the leadership skills most important now and in the future are associated with internet skills, control of one's work, problem solving and human resource management. Managers expected the importance of leadership skills to grow in the future, with the main growth associated with software utilisation, language skills, communication skills and financial leadership skills. According to the managers, their strongest competence lies in internet skills; they also considered themselves to have a good command of skills related to employee know-how and networking. At the same time, significant development needs emerged: the main areas for improvement were software utilisation, control of one's work, human resource management and problem solving. Notably, apart from software utilisation, the main areas for improvement appeared in the very leadership skills that were evaluated as most important. Position, municipal sector and gender explained most of the variation in the responses.
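
Since factor analysis carried the main analysis, a minimal sketch of how such questionnaire data could be reduced to latent skill factors; the response matrix, the item count and the four-factor choice are placeholders, not the study's values:

import numpy as np
from sklearn.decomposition import FactorAnalysis

# One row per manager (n = 324, as in the study), one column per Likert-scale item.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(324, 20)).astype(float)  # placeholder data

fa = FactorAnalysis(n_components=4, random_state=0)  # e.g. four latent skill factors
scores = fa.fit_transform(responses)                 # factor scores per respondent
loadings = fa.components_                            # item loadings per factor
print(loadings.shape)                                # (4, 20)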

Relevance: 20.00%

Abstract:

The objective of this study was to find factors that could predict educational dropout. Dropout risk was assessed against a pupil's cognitive competence, success in school, and personal beliefs regarding self and parents, while taking into account the pupil's background and gender. Based on earlier research, it was assumed that a pupil's gender, success in school, and parents' education would be related to dropping out. This study is part of a project funded by the Academy of Finland and led by Professor Jarkko Hautamäki; the project uses a longitudinal design to assess the development of pupils' learning-to-learn skills. The target group consisted of all Finnish-speaking ninth graders of a municipality in Southern Finland: 1,534 pupils in total, of whom 809 were girls and 725 boys. The assessment of learning-to-learn skills was carried out on the ninth graders in spring 2004 using the "Opiopi" test material, which consists of cognitive tests and questions measuring beliefs. At the same time, the pupils' background information was collected together with their self-reported average grade across all school subjects. In spring 2009, the pupils' joint application data from 2004 and 2005 were collected from the Finnish joint application registers. The data were analyzed using quantitative methods with the SPSS for Windows software; the analysis employed statistical indices, differences in grade averages, a multilevel model, multivariate analysis of variance, and logistic regression analysis. Following earlier research, dropouts were defined as pupils who had not been admitted to, or had not applied to, second degree education under the joint application system. By this definition, 157 pupils (10 % of the target group) were classified as dropouts: 88 girls and 69 boys. The study showed that the school itself does not affect dropout risk, but the school class explains 7.5 % of the variation in dropout risk. Among girls, dropping out is predicted by a poor average grade, a lack of beliefs supporting learning, and a primary choice in the joint application system that is unrealistic relative to one's success in school. Among boys, a poor average grade, unrealistic choices in the joint application system, and the belief that one's parents place little value on education were related to dropout risk. Keywords: educational exclusion, school dropout, success in school, comprehensive school, learning to learn
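
Of the methods listed, logistic regression is the one that directly yields dropout predictions; a minimal sketch with scikit-learn, where the features, their distributions and the labels are placeholders rather than the study's data:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1534  # cohort size, as in the study
X = np.column_stack([
    rng.normal(7.5, 1.0, n),    # self-reported average grade (placeholder scale)
    rng.normal(0.0, 1.0, n),    # standardized learning-related belief score
    rng.integers(0, 2, n),      # gender indicator
])
y = (rng.random(n) < 0.10).astype(int)  # ~10 % dropout base rate, placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(model.coef_, model.score(X_test, y_test))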

Relevance: 20.00%

Abstract:

A concept of god is the whole that an individual experiences as God. The Christian concept of god is based on the triune God: Father, Son, and Holy Spirit. Concepts of god have been examined in many contexts, particularly between the 1940s and the 1970s; school books have been studied widely in Finland, but rarely from the point of view of the concept of god. Accordingly, this survey examines the concept of god in the school books of Evangelical Lutheran and Orthodox religion for grades one to four from 1970–80 and the 2000s, and additionally in the curricula between 1970 and 2004. The concept of god is examined from the perspectives of change over time and denominational emphasis. The first hypothesis was that God the Father is represented in the 21st-century books as a kind and loving figure; the second, that the Trinity and the Holy Spirit receive more space in the Orthodox books than in the Lutheran ones. Twelve school books of Evangelical Lutheran and Orthodox religion for grades one to four were used as research material, drawn from four different series published between 1978 and 2005. Teachers' guidebooks and pupils' exercise books were excluded from the survey. The material was analyzed using abductive content analysis and methodological triangulation; the study includes both qualitative and quantitative aspects. The classification system used to classify the concept of god in the material was based on the research material itself, earlier research, and the underlying theories. The number of mentions of the concept of god was higher in the books from the 21st century. In the Lutheran books, the change appeared as growth in the category of God the Father; in the Orthodox books the trend was the opposite, with growth in the category of Jesus the Son. Contrary to the first hypothesis, the features of a loving God received less emphasis in the newer books of both churches. Mentions of the Holy Spirit and the Trinity were marginal, although, as hypothesized, these categories were larger in the Orthodox books. The books could be seen to confirm the legalistic period of the concept of god in the third and fourth grades. Mentions of the concept of god in the curricula have diminished and become more general, most radically in the 1994 curriculum. The results reflect social changes and reforming views in the curricula; in the books the change was not as pronounced. The idea that the concept of god has shrunk and diminished over time can therefore be rejected.

Relevance: 20.00%

Abstract:

Models are abstractions of reality that have predetermined limits (often not consciously thought through) on what problem domains they can be used to explore. These limits are determined by the range of observed data used to construct and validate the model. However, it is important to remember that operating the model beyond these limits, one of the reasons for building the model in the first place, potentially brings unwanted behaviour and thus reduces the usefulness of the model. Our experience with the Agricultural Production Systems Simulator (APSIM), a farming systems model, has led us to adapt techniques from the disciplines of modelling and software development to create a model development process. This process is simple, easy to follow, and brings a much higher level of stability to the development effort, which in turn delivers a much more useful model. A major part of the process relies on having a range of detailed model tests (unit, simulation, sensibility, validation) that exercise a model at various levels (sub-model, model and simulation). To underline the usefulness of testing, we examine several case studies where simulated output can be compared with simple relationships. For example, output is compared with crop water use efficiency relationships gleaned from the literature to check that the model reproduces the expected function. Similarly, another case study attempts to reproduce generalised hydrological relationships found in the literature. This paper then describes a simple model development process (using version control, automated testing and differencing tools) that will enhance the reliability and usefulness of a model.
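
A minimal sketch of the kind of "sensibility" test described above, phrased as an automated unit test; the water-use-efficiency ceiling and the stand-in output values are our illustrative assumptions, not figures from the paper:

import unittest

def water_use_efficiency(grain_yield_kg_ha, water_used_mm):
    """Crop water use efficiency: yield per unit of water used."""
    return grain_yield_kg_ha / water_used_mm

class SensibilityTests(unittest.TestCase):
    def test_wue_within_literature_bounds(self):
        # Stand-ins for simulated model output (e.g. from an APSIM run).
        simulated_yield, simulated_water = 4500.0, 250.0
        wue = water_use_efficiency(simulated_yield, simulated_water)
        self.assertGreater(wue, 0.0)
        # An illustrative literature ceiling of ~20 kg/ha/mm for wheat.
        self.assertLess(wue, 20.0)

if __name__ == "__main__":
    unittest.main()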