763 results for Computer networks -- Security measures


Relevance:

30.00%

Publisher:

Abstract:

The Finnish electricity distribution sector, rural areas in particular, is facing major challenges because of economic regulation, tightening supply security requirements and the ageing network assets. Therefore, the target in distribution network planning and asset management is to develop and renovate the networks to meet these challenges in compliance with the regulations and in an economically feasible way. Concerning supply security, the new Finnish Electricity Market Act limits the maximum duration of electricity supply interruptions to six hours in urban areas and 36 hours in rural areas. This has a significant impact on distribution network planning, especially in rural areas, where the distribution networks typically require extensive modifications and renovations to meet the supply security requirements. This doctoral thesis introduces a methodology for analysing electricity distribution system development. The methodology is based on, and combines elements of, reliability analysis, asset management and economic regulation. The analysis results can be applied, for instance, to evaluate the development of distribution reliability and to consider actions to meet the tightening regulatory requirements. Thus, the methodology produces information for strategic decision-making so that distribution system operators (DSOs) can respond to the challenges arising in the electricity distribution sector. The key contributions of the thesis are a network renovation concept for rural areas, an analysis for assessing supply security, and an evaluation of the effects of economic regulation on strategic network planning. In addition, the thesis demonstrates how the reliability aspect affects the placement of automation devices and how reserve power can be arranged in a rural area network.
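
As a hedged illustration of the supply security requirement cited above, the following sketch flags feeders whose longest single interruption exceeds the six-hour urban or 36-hour rural limit. The feeder names and outage durations are placeholders, and the check is not the reliability analysis methodology developed in the thesis.

```python
# Illustrative check of the Electricity Market Act supply interruption limits
# (6 h in urban areas, 36 h in rural areas). All interruption data are made up.

LIMITS_H = {"urban": 6.0, "rural": 36.0}

def violates_limit(area_type: str, interruptions_h: list[float]) -> bool:
    """True if any single interruption exceeds the allowed maximum duration."""
    return max(interruptions_h, default=0.0) > LIMITS_H[area_type]

feeders = {
    "feeder_A (urban)": ("urban", [0.5, 2.0]),
    "feeder_B (rural)": ("rural", [12.0, 40.0]),  # a 40 h storm outage exceeds 36 h
}
for name, (area, outages) in feeders.items():
    print(name, "violates limit:", violates_limit(area, outages))
```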

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT Author: Jouni Pyykkö. Title: Distribution network companies' experiences of the economic regulation of their business during the third regulatory period 2012 - 2015. Year: 2014. Place: Hyvinkää. Master's thesis. Lappeenranta University of Technology, Industrial Engineering and Management. 115 pages, 9 figures, 2 tables. Examiners: Professor Timo Pihkala and Postdoctoral Researcher Marita Rautiainen. Keywords: distribution network company, regulation and regulatory supervision. The development of the electricity market initiated by the European Union in the 1990s has meant major changes for companies in the sector in Finland as well. The trade and generation of electrical energy have been opened to free market-based competition. By contrast, electricity transmission and distribution are operated as regional natural monopolies. Business conducted through a monopoly causes significant conflicts of interest among the stakeholders of the electricity distribution business. These conflicts can lead to pricing of electricity distribution that is unreasonable from the consumer's point of view. On the other hand, the business must provide the distribution network company with sufficient incentives to operate and must secure adequate levels of investment in the networks to ensure power quality and supply security. In Finland, the regulation of network companies, carried out by the energy market authority, is used to balance the needs of these different stakeholders of the distribution business. The aim of the regulation is to safeguard the conflicting interests of the parties so that the overall welfare loss is as small as possible. This thesis examines the functionality and problem areas of the regulation from the perspective of the distribution network companies' business. The study was carried out by interviewing nine representatives of distribution network companies. The objective is to identify the regulation-related problems and business challenges that arise from the distribution companies' perspective. The thesis also describes the key regulatory methods and the measures by which, from the distribution companies' point of view, the regulation model could be improved and made more effective.

Relevance:

30.00%

Publisher:

Abstract:

Within the framework of state security policy, the focus of this dissertation is the relationship between how new security threats are perceived and the policy planning and bureaucratic implementation designed to address them. In addition, the thesis explores some of the inertias that may exist in the core of the state apparatus as it addresses new threats, and how these could be better managed. The dissertation is built on five thematic and interrelated articles highlighting different aspects of the process that runs from the moment significant new national security threats are detected by governments until, on the policy planning side, those threats are translated into protective measures within society. The timeline differs widely between countries, and some key aspects of this process are also studied. One focus concerns mechanisms for adaptability within the Intelligence Community, another the policy planning process within Cabinet Offices and National Security Councils, and the third the planning process and how policy is implemented within the bureaucracy. The issue of policy transfer is also analysed, revealing that there is some imitation of innovation within governmental structures and policies, for example in the field of cyber defence. The main finding of the dissertation is that this context has built-in inertias and bureaucratic seams found in most government bureaucratic machineries. Because much of the information and many of the planning measures are subject to security classification, transparency and internal debate on these issues are constrained and alternative assessments become limited. To remedy this situation, the thesis recommends ways to improve the decision-making system in order to streamline the processes involved in making these decisions. Another special focus of the thesis concerns the role of public policy think tanks in the United States as an instrument of change in the country's national security decision-making environment, viewed as a possible source of new ideas and innovation. The findings in this part are based on unique interview data on how think tanks become successful and influence the policy debate in a country such as the United States. It appears clearly that in countries such as the United States think tanks smooth the decision-making processes, and that this model, with some adaptations, might also be transferable to other democratic countries.

Relevance:

30.00%

Publisher:

Abstract:

The number of security violations is increasing, and a security breach could have irreversible impacts on a business. There are several ways to improve organizational security, but some of them may be difficult to comprehend. This thesis demystifies threat modeling as part of secure system development. Threat modeling enables developers to reveal previously undetected security issues in computer systems. It offers organizations a structured approach for finding and addressing threats against vulnerabilities. When implemented correctly, threat modeling reduces the number of defects and malicious attempts against the target environment. In this thesis, the Microsoft Security Development Lifecycle (SDL) is introduced as an effective methodology for reducing defects in the target system. SDL is traditionally meant for software development, but its principles can be partially adapted to IT infrastructure development. The Microsoft threat modeling methodology is an important part of SDL, and it is utilized in this thesis to find threats in the Acme Corporation's factory environment. Acme Corporation is used as a pseudonym for a company providing high-technology consumer electronics. The target of the threat modeling is the IT infrastructure of the factory's manufacturing execution system. The Microsoft threat modeling methodology utilizes the STRIDE mnemonic and data flow diagrams to find threats. The threat modeling carried out in this thesis returned results that were important for the organization. Acme Corporation now has a more comprehensive understanding of the IT infrastructure of the manufacturing execution system. On top of the vulnerability-related results, threat modeling provided coherent views of the target system. Subject matter experts from different areas can now agree on the functions and dependencies of the target system. Threat modeling was recognized as a useful activity for improving security.
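
For context, the STRIDE-per-element approach used in the Microsoft methodology pairs each data flow diagram element type with the threat categories commonly held applicable to it. The sketch below assumes that commonly published mapping; the element names are hypothetical and do not describe Acme Corporation's actual manufacturing execution system.

```python
# Minimal STRIDE-per-element enumeration over a toy data flow diagram.
# The mapping of element types to STRIDE categories follows the commonly
# used chart; the DFD elements themselves are invented examples.

STRIDE_PER_ELEMENT = {
    "external_entity": ["Spoofing", "Repudiation"],
    "process": ["Spoofing", "Tampering", "Repudiation", "Information disclosure",
                "Denial of service", "Elevation of privilege"],
    "data_store": ["Tampering", "Repudiation", "Information disclosure",
                   "Denial of service"],
    "data_flow": ["Tampering", "Information disclosure", "Denial of service"],
}

dfd_elements = [
    ("Operator workstation", "external_entity"),
    ("MES application server", "process"),
    ("Production database", "data_store"),
    ("Workstation-to-MES traffic", "data_flow"),
]

for name, element_type in dfd_elements:
    for threat in STRIDE_PER_ELEMENT[element_type]:
        print(f"{name}: consider {threat}")
```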

Relevance:

30.00%

Publisher:

Abstract:

Over recent years, smart grids have received great public attention. Many of the proposed functionalities rely on power electronics, which, together with the communication network, play a key role in the smart grid. However, "smartness" is not the only driver that motivates research towards distribution networks based on power electronics; the vulnerability of the network to natural hazards has resulted in tightening supply security requirements, set both by electricity end-users and by the authorities. Because of the favourable price development and advancements in the field, direct current (DC) distribution has become an attractive alternative for distribution networks. In this doctoral dissertation, power electronic converters for a low-voltage DC (LVDC) distribution system are investigated. These include the rectifier located at the beginning of the LVDC network and the customer-end inverter (CEI) on the customer premises. Rectifier topologies are introduced, and topologies are chosen for the analysis according to the LVDC system requirements. Similarly, suitable CEI topologies are addressed and selected for study. The application of power electronics to electricity distribution poses some new challenges. Because the electricity end-user is supplied through the CEI, the CEI is responsible for the end-user voltage quality, but it also has to be able to supply adequate current in all operating conditions, including a short circuit, to ensure electrical safety. Supplying short-circuit current with power electronics requires additional measures; therefore, the short-circuit behaviour is described and methods to overcome the challenge of supplying a high current to the fault are proposed. Power electronic converters also produce common-mode (CM) and radio-frequency (RF) electromagnetic interference (EMI), which is not present in AC distribution. Hence, their magnitudes are investigated. To enable comprehensive research in the LVDC distribution field, a research site was built in a public low-voltage distribution network. The implementation was a joint effort of the LVDC research team of Lappeenranta University of Technology and the power company Suur-Savon Sähkö Oy. As a result, the measurements could be conducted in an actual environment, which is important especially for the EMI studies. The main results of the work concern the short-circuit operation of the CEI and the EMI issues. The applicability of power electronic converters to electricity distribution is demonstrated, and suggestions for future research are proposed.
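
As a rough, hedged illustration of why supplying short-circuit current from power electronics requires additional measures, the sketch below compares the current limit of a hypothetical customer-end inverter with the magnetic trip threshold of a downstream miniature circuit breaker, using the IEC 60898 trip bands. All numeric values are assumptions, not measurements from the research site.

```python
# Can a current-limited customer-end inverter (CEI) trip a downstream MCB
# magnetically during a short circuit? Values below are illustrative only.

def mcb_instant_trip_current(rated_a: float, curve: str) -> float:
    """Upper bound of the instantaneous (magnetic) trip band per IEC 60898."""
    multipliers = {"B": 5.0, "C": 10.0, "D": 20.0}  # worst case of each band
    return multipliers[curve] * rated_a

def cei_fault_current(nominal_a: float, overcurrent_limit_pu: float) -> float:
    """Maximum current a current-limited inverter can feed into a fault."""
    return overcurrent_limit_pu * nominal_a

required = mcb_instant_trip_current(rated_a=16.0, curve="C")              # 160 A
available = cei_fault_current(nominal_a=36.0, overcurrent_limit_pu=2.0)  # 72 A
print(f"required {required:.0f} A, available {available:.0f} A, "
      f"guaranteed instantaneous trip: {available >= required}")
```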

Relevance:

30.00%

Publisher:

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. A problem with doing so is how to define secure software and how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis takes the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming language specifics are not discussed in this work. Organizational policy, management issues and the software development process are also out of scope. The first two research questions were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server called Apache James, whose changelog, security issue history and source code were available. The research revealed that there is a consensus in the terminology on software security. Security verification activities are commonly divided into evaluation and assurance. The focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of the software, but in practice they were limited to evaluating different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design. Furthermore, interpreting the metrics' results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out the areas in which security metrics need to improve if verification of security at the design stage is to be possible.
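
As a hedged illustration of the relative nature of such measurements, the sketch below compares two versions of the same software by reported-vulnerability density. It is not one of the 34 metrics surveyed in the thesis, and the release data are placeholders rather than Apache James figures.

```python
# A simple, relative assurance-style measurement: reported-vulnerability
# density per KLOC, compared across two releases of the same software.
from dataclasses import dataclass

@dataclass
class Release:
    name: str
    kloc: float            # thousands of lines of code
    reported_vulns: int    # vulnerabilities reported against this release

    def vuln_density(self) -> float:
        return self.reported_vulns / self.kloc

def compare(old: Release, new: Release) -> str:
    delta = new.vuln_density() - old.vuln_density()
    trend = "improved" if delta < 0 else "worsened"
    return f"{old.name} -> {new.name}: density {trend} by {abs(delta):.3f} vulns/KLOC"

print(compare(Release("v2.3", kloc=85.0, reported_vulns=6),
              Release("v3.0", kloc=92.0, reported_vulns=4)))
```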

Relevance:

30.00%

Publisher:

Abstract:

A growing share of distributed generation in the overall generating capacity of the power system is currently a worldwide tendency. It is driven by several factors, chief among them the difficulties in refining and maintaining the distribution networks of megalopolises; widening environmental concerns, which contribute both to energy efficiency approaches and to the installation of generation based on renewable sources, which is inherently distributed; increased power quality and reliability needs; and progress in the IT field, which makes it feasible to harmonize the needs and interests of generators and consumers of different energy types. At this stage, the volume formed by system-interconnected distributed generation facilities has reached a level at which it has a broad impact on system operation under emergency and post-emergency conditions in several EU countries. The previously applicable approach of preventively tripping these units in case of a fault, to avoid damage to the generating equipment and misoperation of relay protection and automation, is therefore no longer acceptable. In addition, the withstand capability and transient electromechanical stability of generating technologies interconnected in the proximity of load nodes have improved significantly since Low Voltage Ride-Through regulations, followed by the corresponding techniques, were introduced in grid codes. Both aspects require relay protection and auto-reclosing to operate correctly in the presence of distributed generation, which is generally connected after the grid planning and construction phases. This paper proposes solutions to this emerging need with the least possible grid refinements, separately for each type of distributed generation technology that has reached technical maturity to date and for the network's protection. The new generating technologies are represented by equivalents for the calculation of the initial steady-state short-circuit current used to dimension current-sensing relay protection, in line with widely adopted short-circuit calculation practices such as IEC 60909 and VDE 0102. The phenomenon of unintentional islanding, which influences auto-reclosing, is addressed, and the protection schemes used to eliminate a sustained island are listed and characterized by reliability and implementation-related factors; they also form a crucial aspect of realizing the proposed measures for relieving protection operation.
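
As a brief illustration of the short-circuit calculation practice referred to above, the sketch below evaluates the IEC 60909 initial symmetrical short-circuit current Ik'' = c * Un / (sqrt(3) * |Zk|) for the grid contribution and represents a converter-interfaced distributed generator as a current source limited to a multiple of its rated current. The network parameters and the 1.2 p.u. current limit are assumptions, not values from the paper.

```python
# Initial symmetrical short-circuit current per IEC 60909 for the grid feed,
# plus the limited contribution of a converter-interfaced DG unit.
import math

def ik_grid(c: float, un_v: float, zk_ohm: complex) -> float:
    """Grid contribution Ik'' = c * Un / (sqrt(3) * |Zk|), in amperes."""
    return c * un_v / (math.sqrt(3) * abs(zk_ohm))

def ik_converter_dg(sr_va: float, un_v: float, k: float = 1.2) -> float:
    """Converter-interfaced DG modelled as a source limited to k * rated current."""
    i_rated = sr_va / (math.sqrt(3) * un_v)
    return k * i_rated

un = 20_000.0  # 20 kV medium-voltage network
print(f"grid: {ik_grid(c=1.1, un_v=un, zk_ohm=complex(0.5, 3.0)) / 1000:.2f} kA")
print(f"DG:   {ik_converter_dg(sr_va=2_000_000.0, un_v=un) / 1000:.2f} kA")
```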

Relevance:

30.00%

Publisher:

Abstract:

This study compared the relative effectiveness of two computerized remedial reading programs in improving the reading word recognition, rate, and comprehension of adolescent readers demonstrating significant and longstanding reading difficulties. One of the programs was the Autoskill Component Reading Subskills Program, which provides instruction in isolated letters, syllables, and words to a point of rapid automatic responding. This program also incorporates reading disability subtypes in its approach. The second program, Read It Again, Sam, delivers a repeated reading strategy. The study also examined the feasibility of using peer tutors with these two programs. Grade 9 students at a secondary vocational school who satisfied specific criteria with respect to cognitive and reading ability participated. Eighteen students were randomly assigned to three matched groups, based on prior screening on a battery of reading achievement tests. Two groups received training with one of the computer programs; the third group acted as a control and received the remedial reading program offered within the regular classroom. The groups met daily with a trained tutor for approximately 35 minutes and were required to accumulate twenty hours of instruction. At the conclusion of the program, the pretest battery was repeated. No significant differences were found in the treatment effects of the two computer groups. Each of the two treatment groups achieved significantly improved reading word recognition and rate relative to the control group. Comprehension gains were modest. The treatment groups demonstrated a significant gain, relative to the control group, on one of the three comprehension measures; only trends toward a gain were noted on the remaining two measures. The tutoring partnership appeared to be a viable alternative for the teacher seeking to provide individualized computerized remedial programs for adolescent unskilled readers. Both programs took advantage of computer technology in providing individualized drill and practice, instant feedback, and ongoing recordkeeping. With limited cautions, each of these programs was considered effective and practical for use with adolescent unskilled readers.

Relevance:

30.00%

Publisher:

Abstract:

Although there is a consensus in the literature on the many uses of the Internet in education, as well as on the unique features of the Internet for presenting facts and information, there is no consensus on a standardized method for evaluating Internet-based courseware. Educators rarely have the opportunity to participate in the development of Internet-based courseware, yet they are encouraged to use the technology in their learning environments. This creates a need for summative evaluation methods for Internet-based health courseware. The purpose of this study was to assess evaluative measures for Internet-based courseware. Specifically, two entities were evaluated within the study: a) the outcome of the Internet-based courseware, and b) the Internet-based courseware itself. To this end, the Web site www.bodymatters.com was evaluated using two different approaches by two different cohorts. The first approach was a performance appraisal by a group of end-users. A positive, statistically significant change in the students' performance was observed as a result of the intervention of the Web site. The second approach was a product-oriented evaluation of the Web site using a criterion-based checklist and an open-ended comments section. The findings indicate that a summative, criterion-based evaluation is best completed by a multidisciplinary team. The findings also indicated that the two cohorts reported different product-oriented appraisals of the Web site. The current research confirmed previous research finding that a poor evaluation of a Web site returned by experts bears no relationship to whether the end-users' performance improves as a result of the intervention of the Web site.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents two studies, both examining the efficacy of a computer programme (Captain's Log) in training attentional skills. The population of interest is the traumatically brain injured. Study #1 is a single-case design that offers recommendations for the second, larger (N=5) inquiry. Study #2 is an eight-week hierarchical treatment programme with a multi-baseline testing component. Attention, memory, listening comprehension, locus-of-control, self-esteem, visuo-spatial, and general outcome measures are employed within the testing schedule. Results suggest that any improvement was a result of practice effects. With a few single-case exceptions, the participants showed little improvement on the dependent measures.

Relevance:

30.00%

Publisher:

Abstract:

Complex networks can arise naturally and spontaneously from all things that act as part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous, but they are currently poorly understood. A number of algorithms, designed by humans, have been proposed to describe the organizational behaviour of real-world networks, and breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences have recently resulted. These algorithms, called graph models, represent significant human effort. Deriving accurate graph models is non-trivial, time-intensive and challenging, and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure. Furthermore, to the best of the author's knowledge, it is the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than the models currently used in the literature.
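
A minimal sketch of the fitness idea behind such automatic inference follows: a candidate graph model is scored by how closely the networks it generates match a few structural properties of the target network. The candidate generator and the chosen properties are placeholders, not the evolved programs or the exact objective used in the thesis.

```python
# Score a candidate graph generator against a target network by comparing
# simple structural properties over several generated samples (lower is better).
import networkx as nx

def property_vector(g: nx.Graph) -> list[float]:
    return [nx.density(g),
            nx.average_clustering(g),
            nx.degree_assortativity_coefficient(g)]

def fitness(candidate_model, target: nx.Graph, samples: int = 5) -> float:
    t = property_vector(target)
    total = 0.0
    for _ in range(samples):
        g = candidate_model(target.number_of_nodes())
        total += sum(abs(a - b) for a, b in zip(property_vector(g), t))
    return total / samples

# Example: score a Barabasi-Albert generator against a Watts-Strogatz target.
target_net = nx.watts_strogatz_graph(200, k=6, p=0.1, seed=1)
print(fitness(lambda n: nx.barabasi_albert_graph(n, m=3, seed=2), target_net))
```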

Relevance:

30.00%

Publisher:

Abstract:

Complex networks are systems of entities that are interconnected through meaningful relationships. The relations between entities form a structure with a statistical complexity that does not arise by random chance. In the study of complex networks, many graph models have been proposed to model the behaviours observed. However, constructing graph models manually is tedious and problematic, and many of the models proposed in the literature have been cited as having inaccuracies with respect to the complex networks they represent. Recently, an approach that automates the inference of graph models was proposed by Bailey [10]. The proposed methodology employs genetic programming (GP) to produce graph models that approximate various properties of an exemplary graph of a targeted complex network. However, a great deal is already known about complex networks in general, and often specific knowledge is held about the network being modelled. This knowledge, albeit incomplete, is important in constructing a graph model, yet it is difficult to incorporate using existing GP techniques. This thesis therefore proposes a novel GP system that can incorporate incomplete expert knowledge to assist in the evolution of a graph model. Inspired by existing graph models, an abstract graph model was developed to serve as an embryo for inferring graph models of some complex networks. The GP system and the abstract model were used to reproduce well-known graph models. The results indicate that the system was able to evolve models that produced networks with structural similarities to the networks generated by the respective target models.
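
The sketch below illustrates the embryo idea under stated assumptions: the growth loop encodes what is taken to be known about the network (nodes join one at a time and attach to m existing nodes), while only the attachment rule would be left for genetic programming to evolve. The hand-written attachment rule stands in for an evolved expression and is not the abstract model developed in the thesis.

```python
# Fixed "embryo" growth loop with a pluggable attachment rule. GP would evolve
# the attach_weight expression; here a hand-written rule is used instead.
import random
import networkx as nx

def grow(n: int, m: int, attach_weight, seed: int = 0) -> nx.Graph:
    rng = random.Random(seed)
    g = nx.complete_graph(m)                    # fixed embryo: small seed clique
    for new in range(m, n):
        nodes = list(g.nodes)
        weights = [attach_weight(g, v) for v in nodes]
        targets = set()
        while len(targets) < m:                 # fixed: attach to m distinct nodes
            targets.add(rng.choices(nodes, weights=weights, k=1)[0])
        for t in targets:
            g.add_edge(new, t)
    return g

# Stand-in "evolved" rule: preferential attachment with a small uniform term.
g = grow(n=300, m=2, attach_weight=lambda graph, v: graph.degree(v) + 1.0)
print(g.number_of_nodes(), g.number_of_edges(), nx.average_clustering(g))
```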

Relevance:

30.00%

Publisher:

Abstract:

Mutations in genes, which are simple changes in our DNA, can result in undesirable phenotypes known as genetic diseases or disorders. These small changes, which happen frequently, can have extreme consequences. Understanding and identifying these changes, and associating the mutated genes with genetic diseases, can play an important role in our health by enabling better diagnostic and therapeutic strategies for these diseases. Years of experiments have produced a vast amount of data regarding the human genome and different genetic diseases, but the data still need to be processed properly to extract useful information. This work is an effort to analyse some useful datasets and to apply different techniques to associate genes with genetic diseases. Two genetic diseases were studied here: Parkinson's disease and breast cancer. Using genetic programming, we analysed the complex network around the known disease genes of these diseases and, based on that, generated a ranking of genes by their relevance to the diseases. To generate these rankings, the centrality measures of all nodes in the complex network surrounding the known disease genes of the given genetic disease were calculated. Using genetic programming, all nodes were assigned scores based on the similarity of their centrality measures to those of the known disease genes. The obtained results showed that this method is successful at finding patterns in the centrality measures and that the highly ranked genes are worthy candidate disease genes for further study. Using standard benchmark tests, we tested our approach against ENDEAVOUR and CIPHER, two well-known disease gene ranking frameworks, and obtained comparable results.
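
A hedged sketch of the ranking idea follows: centrality measures are computed for every node in the network around the known disease genes, and candidate genes are ranked by the similarity of their centrality profiles to those of the known genes. A simple Euclidean distance stands in for the GP-evolved scoring function, and the example network and "gene" names are placeholders.

```python
# Rank candidate nodes by how similar their centrality profiles are to the
# average profile of a set of known "disease" nodes (placeholder data).
import math
import networkx as nx

def centrality_profile(g: nx.Graph) -> dict:
    measures = [nx.degree_centrality(g),
                nx.betweenness_centrality(g),
                nx.closeness_centrality(g)]
    return {v: [m[v] for m in measures] for v in g.nodes}

def rank_candidates(g: nx.Graph, known: set) -> list:
    prof = centrality_profile(g)
    centre = [sum(prof[v][i] for v in known) / len(known) for i in range(3)]
    candidates = [v for v in g.nodes if v not in known]
    return sorted(candidates, key=lambda v: math.dist(prof[v], centre))

g = nx.les_miserables_graph()                   # placeholder network
print(rank_candidates(g, known={"Valjean", "Marius"})[:5])
```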

Relevance:

30.00%

Publisher:

Abstract:

The KCube interconnection network was first introduced in 2010 in order to exploit the good characteristics of two well-known interconnection networks, the hypercube and the Kautz graph. KCube links up multiple processors in a communication network with high density for a fixed degree. Since the KCube network is newly proposed, much study is required to demonstrate its properties and to design algorithms that solve parallel computation problems on it. In this thesis we introduce a new methodology for constructing the KCube graph. Using this new approach, we prove the Hamiltonicity of the general KC(m; k). Moreover, we determine its connectivity and give an optimal broadcasting scheme, in which a source node containing a message communicates it to all other processors. In addition to KCube networks, we have studied a version of the routing problem in the traditional hypercube, investigating whether there exists a shortest path in Qn between the two nodes 0^n and 1^n when the network contains failed components. We first discuss this problem under a constraint on the number of faulty nodes, and subsequently introduce an algorithm that tackles the problem without such a restriction.
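
For the hypercube routing question, a plain breadth-first search restricted to non-faulty nodes already answers, for small n, whether a shortest (length-n) path from 0^n to 1^n survives a given fault set. The sketch below is such a baseline with an arbitrary example fault set; it is not the algorithm developed in the thesis.

```python
# BFS in the n-dimensional hypercube Q_n from 0^n to 1^n, avoiding faulty nodes.
from collections import deque

def shortest_fault_free_path(n: int, faulty: set[int]):
    src, dst = 0, (1 << n) - 1                 # 0^n and 1^n encoded as bit masks
    if src in faulty or dst in faulty:
        return None
    parent = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for i in range(n):                     # neighbours differ in exactly one bit
            v = u ^ (1 << i)
            if v not in parent and v not in faulty:
                parent[v] = u
                q.append(v)
    return None

path = shortest_fault_free_path(4, faulty={0b0001, 0b1110})
print(path, "shortest" if path and len(path) - 1 == 4 else "detour or unreachable")
```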