856 results for Analysis of Algorithms and Problem Complexity
Abstract:
This report prioritizes the targeted, additional resources that First Steps and system stakeholders believe will be necessary to ensure the BabyNet system earns a federal designation of “meets requirements” for the first time in its 25-year history. It lists key recommendations to help meet those requirements.
Abstract:
The initial aim of the CareMan project was to develop a joint degree programme that combined and utilised the strengths of the five collaborating universities that were already involved in delivering social and health care management education. Because the project was to be implemented in collaboration between educational institutions, the collaboration had to be based on a detailed understanding of the national and institutional specifics of each of the individual academic entities. During this process it was recognised that, due to a number of regulatory issues, achieving the original aim would not be possible; ultimately, following a series of analytical works, which are presented below, it was decided that a set of three master’s level modules should be developed. One of the reasons was that the Finnish law on master’s degrees at universities of applied sciences (UAS) stated that the requirement for entry to a UAS master’s programme was a bachelor’s degree from a UAS or equivalent, plus a minimum of three years of work experience in an appropriate field. The three years’ work experience is also required of international students. In practice this meant that the participating Finnish UASs, Lahti and HAMK, could not award a diploma to foreign students without this work experience. The other European universities do not have the work experience requirement, although some take it as a bonus for admission (FHS UK). There were also other differences in law (e.g., requirements for minimum standards in Social Work education at FHS UK) that could not be overcome during the period of project realisation. Consequently, the outcome was the development of only three common educational modules, each worth 10 ECTS, which were developed, delivered and assessed during the lifetime of the project. The intention was that these would be integrated into the current master’s level provision in each of the universities.
Abstract:
The thesis assesses the impact of international factors on relations between Greek and Turkish Cypriots during and after the Cold War. Through an analysis of the Cyprus problem it explores both why external actors intervene in communal conflicts and how they influence relations between ethnic groups in plural societies. The analytical framework employed throughout the study draws on contributions of International Relations theorists and students of ethnic conflict. The thesis argues that, as in the global political system, relations between ethnic groups in unranked communal systems are anarchic; that is, actors within the system do not recognize a sovereign political authority. In bipolar communal systems dominated by two relatively equal groups, the struggle for security and power often leads to appeals for assistance from external actors. The framework notes that neighboring states and Great Powers may heed calls for assistance, or intervene without a prior request, if it is in their interest to do so. The convergence of regional and global interests in communal affairs exacerbates ethnic conflicts and precludes the development of effective political institutions. The impact of external intervention in ethnic conflicts has the potential to alter the basis of communal relations. The Cyprus problem is examined both during and after the Cold War in order to gauge how global and regional actors and the structure of their respective systems have affected relations between ethnic groups in Cyprus. The thesis argues that Cyprus's descent into civil war in 1963 was due in part to the entrenchment of external interests in the Republic's constitution. The study also notes that power politics involving the United States, Soviet Union, Greece and Turkey continued to affect the development of communal relations throughout the 1960s, 70s, and 80s. External intervention culminated in July and August 1974, after a Greek-sponsored coup was answered by Turkey's invasion and partition of Cyprus. The forced expulsion of Greek Cypriots from the island's northern territories led to the establishment of ethnically homogeneous zones, thus altering the context of communal relations dramatically. The study also examines the role of the United Nations in Cyprus, noting that its failure to settle the dispute was due in large part to a lack of cooperation from Turkey, and the United States' and Soviet Union's acceptance of the status quo following the 1974 invasion and partition of the island. The thesis argues that the deterioration of Greek-Turkish relations in the post-Cold War era has made a solution to the dispute unlikely for the time being. Barring any dramatic changes in relations between communal and regional antagonists, relations between Greek and Turkish Cypriots will continue to develop along the lines established in July/August 1974. The thesis concludes by affirming the validity of its core hypotheses through a brief survey of recent works touching on international politics and ethnic conflict. Questions requiring further research are noted, as are elements of the study that require further refinement.
Abstract:
Competing hypotheses seek to explain the evolution of oxygenic and anoxygenic processes of photosynthesis. Since chlorophyll is less reduced and precedes bacteriochlorophyll on the modern biosynthetic pathway, it has been proposed that chlorophyll preceded bacteriochlorophyll in its evolution. However, recent analyses of nucleotide sequences that encode chlorophyll and bacteriochlorophyll biosynthetic enzymes appear to support an alternative hypothesis: that bacteriochlorophyll evolved earlier than chlorophyll. Here we demonstrate that the presence of invariant sites in sequence datasets leads to inconsistency in tree building (including maximum-likelihood methods). Homologous sequences with different biological functions often share invariant sites at the same nucleotide positions. However, different constraints can also result in additional invariant sites unique to the genes that have specific and different biological functions. Consequently, the distribution of these sites can be uneven between the different types of homologous genes. The presence of invariant sites, both those shared by related biosynthetic genes and those unique to only some of them, has misled the recent evolutionary analysis of oxygenic and anoxygenic photosynthetic pigments. We evaluate an alternative scheme for the evolution of chlorophyll and bacteriochlorophyll.
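The argument above hinges on invariant sites that homologous genes either share or hold uniquely. As a purely illustrative sketch (the alignment, sequence names, and the crude "drop the columns" mitigation below are invented, not taken from the study), such columns can be located like this:

```python
# Toy illustration: flag invariant alignment columns so they can be inspected,
# down-weighted, or removed before tree building. Names and sequences are made up.

alignment = {
    "chl_geneA":  "ATGGCTTACGGA",
    "chl_geneB":  "ATGGCTTACGGT",
    "bchl_geneA": "ATGGATTACGCA",
    "bchl_geneB": "ATGGATTACGCT",
}

def invariant_columns(seqs):
    """Return indices of columns where every sequence carries the same nucleotide."""
    length = len(next(iter(seqs.values())))
    cols = []
    for i in range(length):
        if len({s[i] for s in seqs.values()}) == 1:
            cols.append(i)
    return cols

def strip_columns(seqs, cols):
    """Remove the given columns from every sequence (one crude mitigation)."""
    drop = set(cols)
    return {name: "".join(c for i, c in enumerate(s) if i not in drop)
            for name, s in seqs.items()}

shared_invariant = invariant_columns(alignment)
print("invariant columns:", shared_invariant)
print(strip_columns(alignment, shared_invariant))
```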
Abstract:
We examine the use of randomness extraction and expansion in key agreement (KA) protocols to generate uniformly random keys in the standard model. Although existing works provide the basic theorems necessary, they lack details or examples of appropriate cryptographic primitives and/or parameter sizes. This has led to the large amount of min-entropy needed in the (non-uniform) shared secret being overlooked in proposals and efficiency comparisons of KA protocols. We therefore summarize existing work in the area and examine the security levels achieved with the use of various extractors and expanders for particular parameter sizes. The tables presented herein show that the shared secret needs a min-entropy of at least 292 bits (and even more with more realistic assumptions) to achieve an overall security level of 80 bits using the extractors and expanders we consider. The tables may be used to find the min-entropy required for various security levels and assumptions. We also find that when using the short exponent theorems of Gennaro et al., the short exponents may need to be much longer than they suggested.
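For orientation, a minimal extract-then-expand sketch in the HKDF style with HMAC-SHA-256 is shown below. It is not one of the specific extractor/expander constructions or parameter sets analysed in the paper, and the 64-byte secret merely stands in for a Diffie-Hellman output.

```python
import hmac, hashlib, os

# Illustrative extract-then-expand construction (HKDF-style, HMAC-SHA-256).
# Not the paper's extractors/expanders; it only shows where the non-uniform
# shared secret enters and where the derived session key comes out.

def extract(salt: bytes, shared_secret: bytes) -> bytes:
    """Condense a (non-uniform) shared secret into a pseudorandom key."""
    return hmac.new(salt, shared_secret, hashlib.sha256).digest()

def expand(prk: bytes, info: bytes, length: int) -> bytes:
    """Stretch the extracted key to the desired number of output bytes."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

shared_secret = os.urandom(64)          # stand-in for a Diffie-Hellman output
salt = os.urandom(32)                   # public, random salt
prk = extract(salt, shared_secret)
session_key = expand(prk, b"handshake key", 16)
print(session_key.hex())
```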
Abstract:
Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, thus finding these models and adapting them may be more effective than developing them from scratch. As process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. Experiments are conducted to demonstrate that our proposal achieves a significant reduction in query evaluation time.
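A minimal sketch of the kind of label-based inverted index such a repository might use is given below; the repository contents, label sets, and query are invented for illustration and do not reproduce the paper's index structure.

```python
# Hypothetical inverted index over task labels of process models: a query only
# touches the index entries for its labels instead of scanning every model.

from collections import defaultdict

repository = {
    "claim_handling_v1": {"register claim", "assess damage", "pay claim"},
    "claim_handling_v2": {"register claim", "assess damage", "reject claim"},
    "order_to_cash":     {"receive order", "ship goods", "send invoice"},
}

index = defaultdict(set)
for model_id, labels in repository.items():
    for label in labels:
        index[label].add(model_id)

def query(required_labels):
    """Return models containing all required labels, using only index lookups."""
    candidates = [index.get(label, set()) for label in required_labels]
    return set.intersection(*candidates) if candidates else set()

print(query({"register claim", "assess damage"}))  # both claim handling variants match
```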
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, together with a detailed complexity and accuracy analysis. In the case of time complexity, it is shown that the average case time complexity of the algorithm is Θ(n²), and the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
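A toy transfer of the neighborhood-elimination idea to a one-dimensional elementary cellular automaton (the paper's setting is graph cellular automata; this sketch only conveys the pruning principle) might look as follows: enumerate the local neighborhoods compatible with each target cell, then repeatedly discard those that cannot be stitched to any surviving neighborhood at the next position.

```python
# Toy sketch on a 1-D elementary CA (rule 110, periodic boundary). If pruning
# empties a candidate set, the target configuration has no pre-image, i.e. it
# is a Garden-of-Eden configuration; otherwise this toy version is inconclusive.

from itertools import product

def rule110(l, c, r):
    return {(1,1,1):0, (1,1,0):1, (1,0,1):1, (1,0,0):0,
            (0,1,1):1, (0,1,0):1, (0,0,1):1, (0,0,0):0}[(l, c, r)]

def prune(target):
    n = len(target)
    # candidate local neighborhoods (l, c, r) whose update matches target[i]
    cand = [{t for t in product((0, 1), repeat=3) if rule110(*t) == target[i]}
            for i in range(n)]
    changed = True
    while changed:
        changed = False
        for i in range(n):
            nxt = cand[(i + 1) % n]
            keep = {t for t in cand[i]
                    if any(t[1] == u[0] and t[2] == u[1] for u in nxt)}
            if keep != cand[i]:
                cand[i], changed = keep, True
    return cand

cfg = [1, 0, 1, 1, 0, 0]
if any(not c for c in prune(cfg)):
    print("no pre-image: Garden-of-Eden configuration")
else:
    print("pruning inconclusive; a pre-image may exist")
```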
Abstract:
"The Structure and Interpretation of Computer Programs" is the entry-level subject in Computer Science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in Electrical Engineering or in Computer Science, as one fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to approximately 600 students each year. Most of these students have had little or no prior formal training in computation, although most have played with computers a bit and a few have had extensive programming or hardware design experience. Our design of this introductory Computer Science subject reflects two major concerns. First we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Secondly, we believe that the essential material to be addressed by a subject at this level, is not the syntax of particular programming language constructs, nor clever algorithms for computing particular functions of efficiently, not even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.
Abstract:
This paper investigates the problem of seepage under the floor of hydraulic structures, considering the component of flow that seeps through the surrounding banks of the canal. A computer program, utilizing a finite-element method and capable of handling three-dimensional (3D) saturated–unsaturated flow problems, was used. Different ratios of canal width to differential head applied on the structure were studied. The results produced from the two-dimensional (2D) analysis were observed to deviate largely from those obtained from the 3D analysis of the same problem, despite the fact that the porous medium was isotropic and homogeneous. For example, the exit gradient obtained from the 3D analysis was as high as 2.5 times its value obtained from the 2D analysis. The uplift force acting upwards on the structure also increased by about 46% compared with its value obtained from the 2D solution. When the canal width/differential head ratio was 10 or higher, the 3D results were comparable to the 2D results. It is recommended to construct a core of low-permeability soil in the banks of the canal to reduce the seepage losses, uplift force, and exit gradient.
Abstract:
A Cauchy problem for general elliptic second-order linear partial differential equations, in which the Dirichlet data in H½(Γ1 ∪ Γ3) is assumed available on a larger part of the boundary Γ of the bounded domain Ω than the boundary portion Γ1 on which the Neumann data is prescribed, is investigated using a conjugate gradient method. We obtain an approximation to the solution of the Cauchy problem by minimizing a certain discrete functional and interpolating using the finite difference or boundary element method. The minimization involves solving equations obtained by discretising mixed boundary value problems for the same operator and its adjoint. It is proved that the solution of the discretised optimization problem converges to the continuous one as the mesh size tends to zero. Numerical results are presented and discussed.
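The linear-algebra kernel behind such a discrete minimization is typically a conjugate-gradient iteration on a symmetric positive-definite system. A generic sketch is given below with an invented 2x2 test matrix; it does not reproduce the paper's functional or its discretisation.

```python
# Generic conjugate gradient for a symmetric positive-definite system A x = b.
# The matrix and right-hand side are illustrative only.

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD test matrix
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))           # ~ [0.0909, 0.6364]
```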
Abstract:
Systems analysis (SA) is widely used in solving complex and vague problems. The initial stages of SA involve the analysis of problems and purposes to obtain problems and purposes of smaller complexity and vagueness, which are combined into hierarchical structures of problems (SP) and purposes (PS). Managers have to be sure that the PS, and the purpose-realizing system (PRS) that can achieve the PS purposes, are adequate to the problem to be solved. However, SP and PS are usually not substantiated well enough, because their development is based on collective expertise in which the logic of natural language and expert estimation methods are used. For this reason, the scientific foundations of SA cannot be considered fully formed. The structure-and-purpose approach to SA, based on logic-and-linguistic simulation of problem and purpose analysis, is a step towards formalizing the initial stages of SA to improve the adequacy of their results, and towards increasing the quality of SA as a whole. Managers of industrial organizing systems who use the approach can eliminate logical errors in SP and PS at early stages of planning and are thus able to find better solutions to complex and vague problems.
Abstract:
Finite-Difference Time-Domain (FDTD) algorithms are well-established tools of computational electromagnetism. Because of their practical implementation as computer codes, they are affected by many numerical artefacts and noise. In order to obtain better results, we propose using Principal Component Analysis (PCA) based on multivariate statistical techniques. PCA has been successfully used for the analysis of noise and spatio-temporal structure in a sequence of images. It allows a straightforward discrimination between the numerical noise and the actual electromagnetic variables, and a quantitative estimation of their respective contributions. Moreover, the FDTD results can be filtered to remove the effect of the noise. In this contribution we show how the method can be applied to several FDTD simulations: the propagation of a pulse in vacuum and the analysis of two-dimensional photonic crystals. In the latter case, PCA has revealed hidden electromagnetic structures related to actual modes of the photonic crystal.
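As a hedged illustration of the idea, the sketch below applies PCA (via an SVD) to a stack of synthetic snapshots, a travelling Gaussian pulse plus additive noise, rather than to real FDTD output; the number of components kept for filtering is an arbitrary choice.

```python
# PCA of a sequence of field snapshots: separate a coherent space-time
# structure from additive noise, then reconstruct from the leading components.
# The "field" is synthetic, not an FDTD simulation.

import numpy as np

rng = np.random.default_rng(0)
nt, nx = 200, 128
x = np.linspace(0.0, 1.0, nx)
frames = np.array([np.exp(-((x - 0.1 - 0.004 * t) ** 2) / 0.002) for t in range(nt)])
noisy = frames + 0.05 * rng.standard_normal(frames.shape)

mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)

variance_share = s**2 / np.sum(s**2)
print("variance captured by first 3 components:", variance_share[:3].sum())

# Keep only the dominant components; the remainder is treated as numerical noise.
k = 3
filtered = mean + (U[:, :k] * s[:k]) @ Vt[:k, :]
print("residual RMS after filtering:", np.sqrt(np.mean((filtered - frames) ** 2)))
```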
Abstract:
Background: Financial abuse of elders is an under-acknowledged problem, and professionals' judgements contribute to both the prevalence of abuse and the ability to prevent and intervene. In the absence of a definitive "gold standard" for the judgement, it is desirable to try to bring novice professionals' judgemental risk thresholds to the level of competent professionals as quickly and effectively as possible. This study aimed to test whether a training intervention was able to bring novices' risk thresholds for financial abuse in line with expert opinion. Methods: A signal detection analysis, within a randomised controlled trial of an educational intervention, was undertaken to examine the effect on the ability of novices to efficiently detect financial abuse. Novices (n = 154) and experts (n = 33) judged "certainty of risk" across 43 scenarios; whether a scenario constituted a case of financial abuse or not was a function of expert opinion. The novices were randomised to receive either an on-line educational intervention to improve financial abuse detection (n = 78) or a control condition (no on-line educational intervention, n = 76). Both groups examined 28 scenarios of abuse (11 "signal" scenarios of risk and 17 "noise" scenarios of no risk). After the intervention group had received the on-line training, both groups then examined 15 further scenarios (5 "signal" and 10 "noise" scenarios). Results: Experts were more certain than the novices, pre-intervention (mean 70.61 vs. 58.04) and post-intervention (mean 70.84 vs. 63.04), and more consistent. The intervention group (mean 64.64) were more certain of abuse post-intervention than the control group (mean 61.41, p = 0.02). Signal detection analysis of sensitivity (d′) and bias (C) revealed that this was due to the intervention shifting the novices' tendency towards saying "at risk" (C = −0.34 post-intervention) and away from their pre-intervention level of bias (C = −0.12). Receiver operating characteristic curves revealed more efficient judgements in the intervention group. Conclusion: An educational intervention can improve judgements of financial abuse amongst novice professionals.
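For reference, sensitivity (d′) and criterion (C) are standard signal-detection quantities computed from hit and false-alarm rates; the sketch below uses invented rates, not the study's data.

```python
# Standard signal-detection computation of sensitivity (d') and criterion (C)
# from hit and false-alarm rates. The rates below are illustrative only.

from statistics import NormalDist

def dprime_and_c(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # more negative C = stronger bias towards "at risk"
    return d_prime, criterion

print(dprime_and_c(0.80, 0.30))   # e.g. pre-intervention judgements
print(dprime_and_c(0.85, 0.40))   # e.g. post-intervention: a more liberal criterion
```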
Abstract:
For most of the work done in developing association rule mining, the primary focus has been on the efficiency of the approach, and the quality of the derived rules has been emphasized to a lesser extent. Often, a huge number of rules can be derived from a dataset, but many of them are redundant with respect to other rules and thus useless in practice. The extremely large number of rules makes it difficult for end users to comprehend, and therefore effectively use, the discovered rules, and thus significantly reduces the effectiveness of rule mining algorithms. If the extracted knowledge cannot be effectively used in solving real-world problems, the effort of extracting it is worth little. This is a serious problem that has not yet been solved satisfactorily. In this paper, we propose a concise representation, called the Reliable Approximate basis, for representing non-redundant approximate association rules. We prove that redundancy elimination based on the proposed basis does not reduce the belief in the extracted rules. We also prove that all approximate association rules can be deduced from the Reliable Approximate basis. Therefore, the basis is a lossless representation of approximate association rules.
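As a toy illustration of rule redundancy (using a textbook-style subsumption criterion, not the paper's Reliable Approximate basis), the sketch below flags a rule as redundant when another rule with a smaller antecedent and larger consequent has at least the same confidence; the transactions are invented.

```python
# Toy redundancy check for association rules over an invented transaction set.

transactions = [
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

rules = [
    (frozenset({"bread"}), frozenset({"milk"})),
    (frozenset({"bread", "butter"}), frozenset({"milk"})),
]

# Rule r1 is redundant given r2 if r2's antecedent is a subset, its consequent a
# superset, and its confidence at least as high.
for (a1, c1) in rules:
    for (a2, c2) in rules:
        if (a1, c1) != (a2, c2) and a2 <= a1 and c1 <= c2 \
                and confidence(a2, c2) >= confidence(a1, c1):
            print(f"{set(a1)} -> {set(c1)} is redundant given {set(a2)} -> {set(c2)}")
```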
Abstract:
Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, thus finding these models and adapting them may be more effective and less error-prone than developing them from scratch. Since process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. To make our approach more applicable, we consider the semantic similarity between labels. Experiments are conducted to demonstrate that our approach is efficient.
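A crude stand-in for the label-similarity step is sketched below: query labels are matched to indexed labels by token overlap (Jaccard), so that lexically close labels hit the same index entries. The labels and threshold are invented, and the paper's actual semantic similarity measure may differ (a real one would account for synonyms, not just shared tokens).

```python
# Hypothetical label-aware lookup using token-overlap (Jaccard) similarity.

indexed_labels = {"register claim", "assess damage", "pay claim", "send invoice"}

def jaccard(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def similar_labels(query_label, threshold=0.3):
    """Return indexed labels whose token overlap with the query exceeds the threshold."""
    return {lbl for lbl in indexed_labels if jaccard(query_label, lbl) >= threshold}

print(similar_labels("record claim"))   # labels sharing the token 'claim'
```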