Abstract:
The development of software tools began as the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility to build custom tools as extensions to the existing environment. Eclipse is an open-source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in. It allows developers to include external server-side contexts in their MUPE applications, and additional context sources can be added through Eclipse's extension point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a core tool component and an additional context-source extension component. The core component is responsible for the actual context addition and also provides the needed user interface elements to the Eclipse workbench. The context-source component provides the needed context-source-related information to the core component. As part of the work, an update-site feature was also implemented for distributing the tool through the Eclipse update mechanism.
Abstract:
In the Russian Wholesale Market, electricity and capacity are traded separately. Capacity is a special good whose sale obliges suppliers to keep their generating equipment ready to produce the quantity of electricity indicated by the System Operator. Capacity trading was introduced to maintain reliable and uninterrupted delivery of electricity in the wholesale market. The price of capacity reflects the ongoing investment in construction, modernisation and maintenance of power plants. The sale of capacity thus creates favourable conditions for attracting investment in the energy sector, because it guarantees investors a return on their investments.
Abstract:
In this essay, I argue that someone who adopted a falsificationism of the sort that I have attributed to Nietzsche would be attracted to the doctrine of eternal recurrence. For Nietzsche, to think the becoming revealed through the senses means falsifying it through being. But the eternal recurrence offers the possibility of thinking becoming without falsification. I then argue that someone who held Nietzsche's falsificationism would see in human agency a conflict between being and becoming similar to that in empirical judgment. In the light of this conflict only the eternal recurrence would offer the possibility of truly affirming life. I end by discussing how this reading of the eternal recurrence solves a number of puzzles that have bedeviled interpreters.
Abstract:
Diabetes is a rapidly increasing worldwide problem characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) combined with automatic or semi-automatic image analysis algorithms provides a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For training and ground truth estimation, the algorithm combines manual annotations from several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is a benchmarking framework for eye fundus image analysis algorithms, needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database comprising true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations.
During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made publicly available on the web to set baseline results for automatic detection of diabetic retinopathy. Although it deviates from the general context of the thesis, a simple and effective optic disc localisation method is also presented; optic disc localisation is discussed because normal eye fundus structures are fundamental in the characterisation of DR.
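The one-class idea described above, modelling only the lesion class's colour density and thresholding its likelihood, can be sketched as follows. This is a minimal illustration using scikit-learn's `GaussianMixture` on synthetic colour data; the colour values, threshold choice and number of components are assumptions, not the configuration used in the thesis.

```python
# One-class lesion detection sketch: fit a GMM to the lesion class only,
# then flag pixels whose log-likelihood under that density is high enough.
# Synthetic RGB samples stand in for real fundus image pixels.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# hypothetical colours: "lesions" cluster around dark red, background is lighter
lesion = rng.normal(loc=[0.55, 0.15, 0.10], scale=0.03, size=(500, 3))
background = rng.normal(loc=[0.85, 0.60, 0.40], scale=0.05, size=(500, 3))

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(lesion)  # one-class setting: train on the lesion class alone

# threshold at a low percentile of training log-likelihoods,
# so most true lesion pixels stay above it
threshold = np.percentile(gmm.score_samples(lesion), 5)

def is_lesion(pixels):
    return gmm.score_samples(pixels) >= threshold

print(is_lesion(lesion).mean())      # most lesion pixels detected
print(is_lesion(background).mean())  # few background pixels flagged
```

In practice the threshold would be tuned on the ROC curve from the benchmarking protocol rather than fixed at a percentile.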
Abstract:
The main objective of the study was to identify and evaluate criteria for international partner selection in a university-to-university context. The study attempted to promote a better understanding of how universities should proceed in selecting partners for producing joint research publications. Thus, the aim was to gain an understanding of how research collaborations can be developed and how partners can be selected. The choice of the right partner has been identified as a precondition for partnership success. In international research collaborations, partnering scientists with different skills and backgrounds bring complementary knowledge into research projects, which in most cases results in higher-quality output. Therefore, a set of criteria should be established prior to selecting a partner. This research examined twelve Russian universities with the status of national research university as potential partners for Lappeenranta University of Technology, and selected the most appropriate universities based on the established set of criteria. The potential partners were evaluated using secondary sources by tracking their academic success during the period 2005-2010. Based on the established criteria, the study calculated a partnership index for each university. The results reveal that among the twelve examined universities there are four potential partners that were rather active in publishing scientific articles during 2005-2010.
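A partnership index of the kind mentioned above can be illustrated as a weighted sum of criteria scores normalised to the best value in the candidate pool. Everything below is hypothetical: the criteria names, weights and figures are illustrative, since the abstract does not reproduce the actual set of criteria used in the study.

```python
# Hypothetical partnership index: weighted sum of normalised criteria scores.
WEIGHTS = {"publications": 0.5, "citations": 0.3, "joint_projects": 0.2}

candidates = {
    "University A": {"publications": 120, "citations": 900,  "joint_projects": 4},
    "University B": {"publications": 60,  "citations": 1500, "joint_projects": 9},
    "University C": {"publications": 30,  "citations": 200,  "joint_projects": 1},
}

def partnership_index(scores, pool):
    # normalise each criterion by the best value in the candidate pool
    best = {c: max(u[c] for u in pool.values()) for c in WEIGHTS}
    return sum(w * scores[c] / best[c] for c, w in WEIGHTS.items())

ranking = sorted(candidates,
                 key=lambda u: partnership_index(candidates[u], candidates),
                 reverse=True)
print(ranking)
```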
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
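As a plain-code illustration of the invariant-first style (not the Socos diagram notation or the PVS workflow described above), here is a sort in which the loop invariant is stated before the body and re-checked after every step, so the program stays internally consistent as it is constructed:

```python
# Invariant-first sketch: the invariant "a[:j] is sorted" is asserted on
# loop entry and re-established after each insertion step.
def sorted_prefix(a, k):
    return all(a[i] <= a[i + 1] for i in range(k - 1))

def insertion_sort(a):
    a = list(a)
    for j in range(1, len(a)):
        assert sorted_prefix(a, j)        # invariant holds on entry
        x, i = a[j], j - 1
        while i >= 0 and a[i] > x:        # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = x
        assert sorted_prefix(a, j + 1)    # invariant re-established
    return a

print(insertion_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```

In the Socos workflow these runtime assertions would instead become verification conditions discharged statically by the theorem prover.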
Abstract:
The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but actively construes intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse.
Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God. The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.
Abstract:
In March 2003, a Finnish law firm was certified by the European Commission as the best in Europe in the special category of lifelong learning. The law firm was surprised by the award, since it had not actively and/or consciously implemented or practised a lifelong learning strategy among its staff. The firm had entered the "Best Workplaces in Europe 2003" competition without being aware of the European Commission's special categories. Because the firm had not consciously implemented a lifelong learning strategy among its staff, the actors whose perceptions and talk this thesis concerns formed their conceptions of, and talk about, lifelong learning after the award. In this study, the translation process of an idea is thus triggered by an external event. In her thesis, Annica Isacsson describes how and why an idea (lifelong learning) is born (anew) at an institutional level, how the idea travels and changes in a process of translation, how it lands in two organisations, and how the idea of lifelong learning is perceived and described by local actors in two different organisations. The focus of the study is thus on individual actors' perceptions of a controversial concept in a local context. Theoretically, the study brings together and links theory on lifelong learning, sociocultural theories of learning, and theories of organisational learning. Isacsson's thesis shows that lifelong learning in an organisational context is not only a matter of individual competence development but also of organisational learning, which includes learning from other organisation members and from other organisations. The study further shows how individual actors' talk is shaped by the institutional field and by the spirit of the time in which the lifelong learning discourse is born, travels and is embedded.
Abstract:
In this work a fuzzy linear system is used to solve the Leontief input-output model with fuzzy entries. For solving this model, we assume that the consumption matrix from different sectors of the economy and the demand are known. These assumptions depend heavily on information obtained from the industries, and hence uncertainties are involved. The aim of this work is to model these uncertainties and to address them with fuzzy entries such as fuzzy numbers and LR-type fuzzy numbers (triangular and trapezoidal). A fuzzy linear system is developed using fuzzy data and solved with the Gauss-Seidel algorithm. Numerical examples show the efficiency of this algorithm. The famous example from Prof. Leontief, in which he solved the production levels for the U.S. economy in 1958, is also further analysed.
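The crisp (non-fuzzy) core of this computation can be sketched as follows: the Leontief balance x = Ax + d is rewritten as (I - A)x = d and solved with Gauss-Seidel iteration. The two-sector consumption matrix and demand vector below are made-up illustrative data, and the thesis's fuzzy extension (triangular/trapezoidal entries) is not reproduced here.

```python
# Crisp Leontief input-output model solved by Gauss-Seidel iteration.
import numpy as np

A = np.array([[0.2, 0.3],     # illustrative consumption (input-output) matrix
              [0.4, 0.1]])
d = np.array([100.0, 150.0])  # illustrative final demand

def gauss_seidel_leontief(A, d, iters=100):
    n = len(d)
    M = np.eye(n) - A          # solve (I - A) x = d
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # use the newest values of the other components immediately
            s = sum(M[i, j] * x[j] for j in range(n) if j != i)
            x[i] = (d[i] - s) / M[i, i]
    return x

x = gauss_seidel_leontief(A, d)
print(x)  # production levels satisfying x = A @ x + d
```

Gauss-Seidel converges here because I - A is strictly diagonally dominant, which holds whenever each sector's column of input coefficients sums to less than one.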
Abstract:
Political participation is a defining element of every democratic political system, also between elections. But there has been a significant shift in which activities are perceived as political participation. The focus is no longer only on activities within political parties, but also on various protest activities, involvement in new social movements, and lifestyle politics in the form of political consumerism. Political participation between elections can lead to a potential legitimacy conflict. The potential conflict between accountability and citizens' active involvement has long been recognised. Through various institutional mechanisms, representative democracies have tried to construct a functioning political system that combines opportunities for political participation with a clear structure of accountability. In this context, institutional openness has held a central position, since it is assumed to affect how easy it is for citizens to influence formal decision-makers. The thesis therefore examines the consequences of institutional openness for different forms of political participation. The results suggest that the institutional design of democratic states has substantial consequences for political participation. But the interplay between the system and participation appears to be more intricate than the dominant theories of political institutions and participation suggest. Institutional openness does not have the expected effect depending on whether the political act takes place inside or outside the formal system, and the institutional effect is more pronounced for associational activism and political consumerism, the activities furthest removed from the formal political system. The results thus challenge central theoretical assumptions in research on political participation. In light of the results presented in the thesis, a reassessment of the effect of institutional openness appears particularly warranted.
Abstract:
Leadership is essential for the effectiveness of teams and the organizations they are part of. The challenges facing organizations today require an exhaustive review of the strategic role of leadership. In this context, it is necessary to explore new types of leadership capable of providing an effective response to new needs. Present-day situations, characterized by complexity and ambiguity, make it difficult for an external leader to perform all leadership functions successfully. Likewise, knowledge-based work requires providing professional groups with sufficient autonomy to perform leadership functions. This study focuses on shared leadership in the team context. Shared leadership is seen as an emergent team property resulting from the distribution of leadership influence across multiple team members. It entails sharing power and influence broadly among the team members rather than centralizing it in the hands of a single individual who acts in the clear role of a leader. By identifying the team itself as a key source of influence, this study points to the relational nature of leadership as a social construct, where leadership is seen as a social process co-constructed by several team members. Based on recent theoretical developments concerned with relational, practice-based and constructionist approaches to the study of leadership processes, this thesis proposes studying leadership interactions, working processes and practices with a focus on the construction of direction, alignment and commitment. During the research process, critical events, activities, working processes and practices of a case team were examined and analyzed with the grounded theory approach in terms of shared leadership. There are a variety of components to this complex process and a multitude of factors that may influence the development of shared leadership.
The study suggests that the development process of shared leadership is a common sense-making process and consists of four overlapping dimensions (individual, social, structural, and developmental) to work with as a team. For shared leadership to emerge, the members of the team must offer leadership services, and the team as a whole must be willing to rely on leadership by multiple team members. For these individual and collective behaviors to occur, the team members must believe that offering influence to and accepting it from fellow team members are welcome and constructive actions. Leadership emerges when people with differing world views use dialogue and collaborative learning to create spaces where a shared common purpose can be achieved while a diversity of perspectives is preserved and valued. This study also suggests that this process can be supported by different kinds of meaning-making and process tools. Leadership, then, does not reside in a person or in a role, but in the social system. The framework built here integrates the different dimensions of shared leadership and describes their relationships. In this way, the findings of this study contribute to the understanding of what constitutes the essential aspects of shared leadership in the team context, and can be of theoretical value in advancing the adoption and development of shared leadership. In the real world, teams and organizations can create conditions to foster and facilitate the process. We should encourage leaders and team members to approach leadership as a collective effort that the team can be prepared for, so that the response is rapid and efficient.
Abstract:
The doctoral thesis examines the solving ability of a number of solvers for optimisation problems and reveals several difficulties in making a fair solver comparison. In addition, some improvements made to one of the solvers, GAMS/AlphaECP, are presented. Optimisation, in this context, means finding the best possible solution to a problem. The investigated class of problems can be characterised as hard to solve and occurs in several industrial areas. The goal has been to investigate whether there is a solver that is universally faster and finds solutions of higher quality than any of the other solvers. The commercial optimisation system GAMS (General Algebraic Modeling System) and extensive problem libraries have been used to compare solvers. The improvements presented were made to the GAMS/AlphaECP solver, which is based on the Extended Cutting Plane (ECP) method. The ECP method has been developed mainly by Professor Tapio Westerlund at the Anläggnings- och systemteknik laboratory at Åbo Akademi University.
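The cutting-plane idea behind the ECP method can be illustrated with a toy Kelley-style loop in one dimension. This is a simplified relative of GAMS/AlphaECP, not the solver itself: the nonlinear objective is replaced by the pointwise maximum of its linearisations (cuts), the piecewise-linear model is minimised, and a new cut is added at the model minimiser. The objective, bounds and grid below are all made up for illustration.

```python
# Toy Kelley-style cutting-plane loop for a convex 1-D objective.
f = lambda x: x * x      # convex objective to minimise on [-2, 2]
g = lambda x: 2 * x      # its derivative

grid = [i / 1000 * 4 - 2 for i in range(1001)]  # 1-D grid stands in for an LP
cuts = []                                       # (slope, intercept) per cut
xk = 2.0
for _ in range(20):
    cuts.append((g(xk), f(xk) - g(xk) * xk))    # cut: y = f(xk) + g(xk)(x - xk)
    model = lambda x: max(a * x + b for a, b in cuts)
    xk = min(grid, key=model)                   # minimise the cut model
    if f(xk) - model(xk) < 1e-9:                # model matches f: optimal
        break

print(xk)  # close to the true minimiser x = 0
```

A real MINLP solver minimises the cut model with an LP/MILP subsolver instead of a grid, and handles constraints and integer variables, but the refine-and-resolve loop is the same.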
Abstract:
Net radiation (Rn) represents the main source of energy for the physical and chemical processes that occur at the surface-atmosphere interface; it drives air and soil heating, water transfer from the surface to the atmosphere in the form of vapor, and the metabolism of plants, especially photosynthesis. Where there is no record of net radiation for certain areas, other sources of information must be used to estimate it, notably those provided by remote sensing. In this context, this work aims to estimate net radiation using MODIS sensor products in the sub-basins of Entre Ribeiros creek and the Preto River, located between the Brazilian states of Goiás and Minas Gerais. SEBAL (Surface Energy Balance Algorithm for Land) was used to obtain Rn on four different days in the period from July to October 2007. The Rn results obtained were consistent with others cited in the literature and are important because orbital information can help determine Rn in areas without automatic weather stations to record net radiation.
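In SEBAL, Rn follows the standard surface radiation balance, Rn = (1 - α)RS↓ + RL↓ - RL↑ - (1 - ε0)RL↓, where α is surface albedo, RS↓ incoming shortwave radiation, RL↓ and RL↑ incoming and outgoing longwave radiation, and ε0 surface emissivity. A minimal sketch, with illustrative input values that are not taken from the thesis:

```python
# Surface radiation balance as used in SEBAL; inputs are illustrative only.
def net_radiation(rs_down, albedo, rl_down, rl_up, emissivity):
    """Rn = (1 - a)*RS_down + RL_down - RL_up - (1 - e0)*RL_down  [W/m^2]"""
    return (1 - albedo) * rs_down + rl_down - rl_up - (1 - emissivity) * rl_down

rn = net_radiation(rs_down=650.0,   # incoming shortwave, W/m^2
                   albedo=0.20,     # surface albedo (from reflectances)
                   rl_down=350.0,   # incoming longwave, W/m^2
                   rl_up=430.0,     # outgoing longwave, W/m^2
                   emissivity=0.96) # surface emissivity
print(rn)  # about 426 W/m^2 for these inputs
```

In the SEBAL workflow each of these inputs is itself derived per pixel from the satellite bands (albedo and emissivity from reflectances, longwave terms from surface and air temperature).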
Abstract:
This work presents software, developed in the Delphi programming language, to compute a reservoir's annual regulated active storage based on the sequent-peak algorithm. Mathematical models used for that purpose generally require extended hydrological series, whose analysis is usually performed with spreadsheets or graphical representations. For this reason, a program was developed to calculate reservoir active capacity. An example calculation is shown using 30 years (1977 to 2009) of monthly mean flow historical data from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and highlight information of interest to the user. Moreover, with this interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand situations. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. With its functionality, this program is an important tool for decision making in water resources management.
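The sequent-peak algorithm the program is built on can be sketched as follows: the running deficit of demand over inflow is accumulated, reset to zero whenever inflow exceeds demand, and its peak over the series is the required active capacity. The monthly flows below are made-up values, not the Corrente River series used in the paper.

```python
# Sequent-peak sketch: required active storage for a constant demand.
def sequent_peak(inflows, demand):
    deficit, capacity = 0.0, 0.0
    for q in inflows:
        # accumulate the shortfall; a surplus month resets it toward zero
        deficit = max(0.0, deficit + demand - q)
        capacity = max(capacity, deficit)   # peak deficit = required storage
    return capacity

inflows = [10, 4, 2, 3, 12, 15, 9, 5, 3, 2, 8, 14]  # illustrative monthly flows
print(sequent_peak(inflows, demand=7))  # 12.0 (largest cumulative shortfall)
```

With seasonal demands, `demand` would become a per-month series, which is exactly the kind of simulation the described interface supports.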