820 results for Search-based algorithms
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
BCM (Business Continuity Management) is a holistic management process aimed at ensuring business continuity and building organizational resilience. Maturity models offer organizations a tool for evaluating their current maturity in a certain process. In recent years BCM has been subject to international ISO standardization, while organizations' interest in benchmarking their state of BCM against standards, and their use of maturity models for these assessments, has increased. However, although new standards have been introduced, very little attention has been paid to reviewing the existing BCM maturity models in research, especially in the light of the new ISO 22301 standard for BCM. In this thesis the existing BCM maturity models are carefully evaluated to determine whether they could be improved. In order to accomplish this, the compliance of the existing models with the ISO 22301 standard is measured and a framework for assessing a maturity model's quality is defined. After carefully evaluating the existing frameworks for maturity model development and evaluation, an approach suggested by Becker et al. (2009) was chosen as the basis for the research. In addition to the procedural model, a set of seven research guidelines proposed by the same authors was applied, drawing on the design-science research guidelines suggested by Hevner et al. (2004). Furthermore, the existing models' form and function were evaluated to address their usability. Based on the evaluation of the existing BCM maturity models, the existing models were found to have shortcomings in each dimension of the evaluation. Utilizing the best of the existing models, a draft version of an enhanced model was developed. This draft model was then iteratively developed by conducting six semi-structured interviews with BCM professionals in Finland with the aim of validating and improving it.
As a result, a final version of the enhanced BCM maturity model was developed, conforming to the seven key clauses of the ISO 22301 standard and to the maturity model development guidelines suggested by Becker et al. (2009).
Abstract:
The objective of this study was to identify restriction fragment length polymorphism (RFLP) markers linked to QTLs that control aluminum (Al) tolerance in maize. The strategy used was bulked segregant analysis (BSA) and the genetic material utilized was an F2 population derived from a cross between the Al-susceptible inbred line L53 and the Al-tolerant inbred line L1327. Both lines were developed at the National Maize and Sorghum Research Center - CNPMS/EMBRAPA. The F2 population of 1554 individuals was evaluated in a nutrient solution containing a toxic concentration of Al, and relative seminal root length (RSRL) was used as a phenotypic measure of tolerance. The RSRL frequency distribution was continuous, but skewed towards Al-susceptible individuals. Seedlings of the F2 population which scored the highest and the lowest RSRL values were transplanted to the field and subsequently selfed to obtain F3 families. Thirty F3 families (15 Al-susceptible and 15 Al-tolerant) were evaluated in nutrient solution, using an incomplete block design, to identify those with the smallest variances for aluminum tolerance and susceptibility. Six Al-susceptible and five Al-tolerant F3 families were chosen to construct one pool of Al-susceptible individuals, and another of Al-tolerant, herein referred to as "bulks", based on average values of RSRL and genetic variance. One hundred and thirteen probes were selected, with an average interval of 30 cM, covering the 10 maize chromosomes. These were tested for their ability to discriminate the parental lines. Fifty-four of these probes were polymorphic, with 46 showing codominance. These probes were hybridized with DNA from the two contrasting bulks. Three RFLPs on chromosome 8 distinguished the bulks on the basis of band intensity. DNA of individuals from the bulks was hybridized with these probes and showed the presence of heterozygous individuals in each bulk.
These results suggest that in maize there is a region related to aluminum tolerance on chromosome 8.
Abstract:
Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically by using Gaussian kernels. This allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge finding methods are adapted to two different applications. The first one is the extraction of curvilinear structures from noisy data mixed with background clutter. The second one is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications, where most of the earlier approaches are inadequate. Examples include the identification of faults from seismic data and the identification of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and the reconstruction of periodic patterns from noisy time series data are also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into the Euclidean space.
The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but has also potential applications in graph theory and various areas of physics, chemistry and engineering. Asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated when the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
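The ridge projection idea can be illustrated in its simplest, zero-dimensional form: a local maximum of a Gaussian kernel density can be reached by plain mean-shift iteration. The sketch below is only a simplified stand-in for the trust region Newton projection described above; the data, bandwidth, and function name are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def mean_shift_mode(data, x0, bandwidth=0.5, tol=1e-6, max_iter=500):
    """Iteratively move x0 onto a local maximum (mode) of the Gaussian KDE of `data`."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = data - x                                   # offsets to all samples
        w = np.exp(-0.5 * np.sum(d * d, axis=1) / bandwidth**2)  # Gaussian kernel weights
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()        # weighted mean = mean-shift step
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Hypothetical data: one tight Gaussian cluster; the mode should land near its centre.
rng = np.random.default_rng(1)
data = rng.normal(loc=[2.0, -1.0], scale=0.3, size=(400, 2))
mode = mean_shift_mode(data, x0=[1.0, 0.0])
```

A full ridge projection would replace the mean-shift step with a constrained Newton step using the KDE gradient and Hessian, as the thesis does, but the convergence target is the same kind of density feature.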
Abstract:
The dissertation proposes two control strategies, covering trajectory planning and vibration suppression, for a kinematically redundant serial-parallel robot machine, with the aim of attaining satisfactory machining performance. For a given prescribed trajectory of the robot's end-effector in Cartesian space, a set of trajectories in the robot's joint space is generated based on the best stiffness performance of the robot along the prescribed trajectory. To construct the required system-wide analytical stiffness model for the serial-parallel robot machine, a variant of the virtual joint method (VJM) is proposed in the dissertation. The modified method is an evolution of Gosselin's lumped model that can account for the deformations of a flexible link in more directions. The effectiveness of this VJM variant is validated by comparing the computed stiffness results of a flexible link with those of a matrix structural analysis (MSA) method. The comparison shows that the numerical results from both methods on an individual flexible beam are almost identical, which, in some sense, provides mutual validation. The most prominent advantage of the presented VJM variant compared with the MSA method is that it can be applied to a flexible structure system with complicated kinematics formed in terms of flexible serial links and joints. Moreover, by combining the VJM variant and the virtual work principle, a system-wide analytical stiffness model can be easily obtained for mechanisms with both serial kinematics and parallel kinematics. In the dissertation, a system-wide stiffness model of a kinematically redundant serial-parallel robot machine is constructed based on the integration of the VJM variant and the virtual work principle. Numerical results of its stiffness performance are reported.
For a kinematically redundant robot, to generate a set of feasible joint trajectories for a prescribed trajectory of its end-effector, the robot's system-wide stiffness performance is taken as the constraint in the joint trajectory planning in the dissertation. For a prescribed location of the end-effector, the robot permits an infinite number of inverse solutions, which consequently yields infinitely many stiffness performances. Therefore, a differential evolution (DE) algorithm, in which the positions of the redundant joints in the kinematics are taken as input variables, was employed to search for the best stiffness performance of the robot. Numerical results of the generated joint trajectories are given for a kinematically redundant serial-parallel robot machine, IWR (Intersector Welding/Cutting Robot), when a particular trajectory of its end-effector has been prescribed. The numerical results show that the joint trajectories generated based on the stiffness optimization are feasible for realization in the control system, since they are acceptably smooth. The results imply that the stiffness performance of the robot machine varies smoothly with respect to the kinematic configuration in the adjacent domain of its best stiffness performance. To suppress the vibration of the robot machine due to varying cutting force during the machining process, this dissertation proposes a feedforward control strategy, which is constructed based on the derived inverse dynamics model of the target system. The effectiveness of applying such feedforward control in vibration suppression has been validated on a parallel manipulator in a software environment. An experimental study of such feedforward control has also been included in the dissertation. The difficulty of modelling the actual system, due to the unknown components in its dynamics, is noted.
As a solution, a back propagation (BP) neural network is proposed for the identification of the unknown components of the dynamics model of the target system. To train such a BP neural network, a modified Levenberg-Marquardt algorithm that can utilize an experimental input-output data set of the entire dynamic system is introduced in the dissertation. Validation of the BP neural network and the modified Levenberg-Marquardt algorithm is done, respectively, by sinusoidal output approximation, second-order system parameter estimation, and friction model estimation of a parallel manipulator, which represent three different application aspects of this method.
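A minimal DE/rand/1/bin loop of the kind used for the stiffness-based trajectory search might be sketched as follows. The `sphere` objective is a hypothetical stand-in for the robot's stiffness measure, and all parameter values are illustrative assumptions, not those used in the dissertation.

```python
import numpy as np

def differential_evolution(objective, bounds, pop_size=20, f=0.8, cr=0.9,
                           generations=100, seed=0):
    """Minimize `objective` over the box `bounds` with classic DE/rand/1/bin."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)      # random initial population
    fitness = np.array([objective(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            idx = rng.choice([j for j in range(pop_size) if j != i], size=3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + f * (b - c), lo, hi)       # differential mutation
            cross = rng.random(dim) < cr                    # binomial crossover mask
            cross[rng.integers(dim)] = True                 # force at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            ft = objective(trial)
            if ft < fitness[i]:                             # greedy selection
                pop[i], fitness[i] = trial, ft
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Hypothetical stand-in for the stiffness objective: a smooth test function.
def sphere(x):
    return float(np.sum(x**2))

x_best, f_best = differential_evolution(sphere, bounds=[(-5, 5)] * 3)
```

In the dissertation's setting, the decision variables would be the redundant joint positions and the objective the system-wide stiffness performance along the prescribed trajectory.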
Abstract:
This thesis considers optimization problems arising in printed circuit board assembly. In particular, the case in which the electronic components of a single circuit board are placed using a single placement machine is studied. Although there is a large number of different placement machines, the use of collect-and-place-type gantry machines is discussed because of their flexibility and increasing popularity in the industry. Instead of solving the entire control optimization problem of a collect-and-place machine with a single application, the problem is divided into multiple subproblems because of its hard combinatorial nature. This dividing technique is called hierarchical decomposition. All the subproblems of the one PCB, one machine context are described, classified and reviewed. The derived subproblems are then either solved with exact methods or new heuristic algorithms are developed and applied. The exact methods include, for example, a greedy algorithm and a solution based on dynamic programming. Some of the proposed heuristics contain constructive parts while others utilize local search or are based on frequency calculations. For the heuristics, comprehensive experimental tests confirm that they are applicable and feasible. A number of quality functions are proposed for evaluation and applied to the subproblems. In the experimental tests, artificially generated data from Markov models and data from real-world PCB production are used. The thesis consists of an introduction and of five publications where the developed and used solution methods are described in full detail. For all the problems stated in this thesis, the methods proposed are efficient enough to be used in PCB assembly production in practice and are readily applicable in the PCB manufacturing industry.
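As a toy illustration of the constructive heuristics mentioned above, a greedy nearest-neighbour ordering of placement positions can be sketched as below; the `pads` coordinates and the function name are made up for illustration and are not taken from the thesis.

```python
import math

def nearest_neighbour_order(points, start=0):
    """Greedy nearest-neighbour ordering of placement points, starting at `start`."""
    unvisited = set(range(len(points)))
    order = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[order[-1]]
        # always move to the closest not-yet-placed position
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Hypothetical pad coordinates on a board.
pads = [(0, 0), (5, 5), (1, 0), (6, 5), (0, 1)]
order = nearest_neighbour_order(pads)
print(order)
```

Real placement-order subproblems also account for feeder positions and nozzle changes, which is why the thesis pairs such constructive starts with local search.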
Abstract:
Since there are some concerns about the effectiveness of highly active antiretroviral therapy in developing countries, we compared the initial combination antiretroviral therapy with zidovudine and lamivudine plus either nelfinavir or efavirenz at a university-based outpatient service in Brazil. This was a retrospective comparative cohort study carried out in a tertiary level hospital. A total of 194 patients receiving either nelfinavir or efavirenz were identified through our electronic database search, but only 126 patients met the inclusion criteria. Patients were included if they were older than 18 years old, naive for antiretroviral therapy, and had at least 1 follow-up visit after starting the antiretroviral regimen. Fifty-one of the included patients were receiving a nelfinavir-based regimen and 75 an efavirenz-based regimen as outpatients. Antiretroviral therapy was prescribed to all patients according to current guidelines. By intention-to-treat (missing/switch = failure), after a 12-month period, 65% of the patients in the efavirenz group reached a viral load <400 copies/mL compared to 41% of the patients in the nelfinavir group (P = 0.01). The mean CD4 cell count increase after a 12-month period was also greater in the efavirenz group (195 x 10(6) cells/L) than in the nelfinavir group (119 x 10(6) cells/L; P = 0.002). The efavirenz-based regimen was superior compared to the nelfinavir-based regimen. The low response rate in the nelfinavir group might be partially explained by the difficulty of using a regimen requiring a higher patient compliance (12 vs 3 pills a day) in a developing country.
Abstract:
We compared the cost-benefit of two algorithms, recently proposed by the Centers for Disease Control and Prevention, USA, with the conventional one, the most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of ELISA anti-HCV samples, using an s/co ratio that shows ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, and IB for PCR-negative samples. For algorithm C, all positive or inconclusive ELISA samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, and 283 were concordant with one another. Indeterminate results from algorithms A and C were elucidated by PCR (expanded algorithm), which detected two more positive samples. The estimated cost of algorithms A and B was US$21,299.39 and US$32,397.40, respectively, which were 43.5 and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples. Algorithm B provides early information about the presence of viremia.
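The triage logic of algorithm A can be sketched roughly as follows; the `high_cutoff` value is a hypothetical placeholder for the assay-specific threshold that yields ≥95% concordance with immunoblot, not a figure from the study.

```python
def triage_by_sco(sco_ratio, high_cutoff=11.0):
    """Sketch of s/co-based triage in the spirit of algorithm A.

    `high_cutoff` is a hypothetical, assay-dependent threshold: above it,
    the ELISA result alone is reported; below it, supplemental testing
    by immunoblot (IB) is required.
    """
    if sco_ratio >= high_cutoff:
        return "anti-HCV positive (no supplemental test)"
    return "refer to immunoblot (IB)"

print(triage_by_sco(20.0))
print(triage_by_sco(3.2))
```

This is what makes algorithm A economical: only the low-ratio fraction of samples (54% in the study) incurs the cost of a supplemental test.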
Abstract:
In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid body tracking. Since then, direct 3D tracking has become a quite popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that could be applied to augmented reality, but some other applications are covered as well. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work has been conducted by searching through well-known academic search databases in a systematic manner and by selecting certain publications for closer examination. We analyze the results by dividing the found papers into different categories by their way of implementation. The issues which have not yet been solved are discussed. We also discuss emerging model-based methods, such as fusing different types of features and region-based pose estimation, which could show the way for future research in this subject.
Abstract:
The genetic and environmental risk factors of vascular cognitive impairment are still largely unknown. This thesis aimed to assess the genetic background of two clinically similar familial small vessel diseases (SVD), CADASIL (Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy) and Swedish hMID (hereditary multi-infarct dementia of Swedish type). In the first study, selected genetic modifiers of CADASIL were studied in a homogenous Finnish CADASIL population of 134 patients, all carrying the p.Arg133Cys mutation in NOTCH3. Apolipoprotein E (APOE) genotypes, the angiotensinogen (AGT) p.Met268Thr polymorphism and eight NOTCH3 polymorphisms were studied, but no associations between any particular genetic variant and first-ever stroke or migraine were seen. In the second study, smoking, statin medication and physical activity were suggested to be the most profound environmental differences among the monozygotic twins with CADASIL. Swedish hMID was long misdiagnosed as CADASIL. In the third study, the CADASIL diagnosis in the Swedish hMID family was ruled out on the basis of genetic, radiological and pathological findings, and Swedish hMID was suggested to represent a novel SVD. In the fourth study, the gene defect of Swedish hMID was then sought using whole exome sequencing paired with a linkage analysis. The strongest candidate for the pathogenic mutation was a 3’UTR variant in the COL4A1 gene, but further studies are needed to confirm its functionality. This study provided new information about the genetic background of two inherited SVDs. Profound knowledge about the pathogenic mutations causing familial SVD is also important for correct diagnosis and treatment options.
Abstract:
The dissertation examines the rule of law within the European Union in the theoretical framework of constitutional pluralism. The leading lines of constitutional pluralism are examined in relation to the traditional and prevailing monistic and hierarchical conceptions of how to perceive legal orders in Europe. The theoretical part also offers a historical perspective by highlighting some of the turning points for the Union constitutional legal order in the framework of European integration. The concept of the rule of law is examined in legal terms, and its meaning for the Union constitutional constellation as a constitutional principle and a common value is observed. The realization of the rule of law at the supranational and national levels is explored, with a view to discovering that recent developments in some of the Member States give rise to concern about the viability of the rule of law within the European Union. It is recognized that the inobservance of the rule of law at the national level poses a threat to the supranational constitutional legal order. The relationship between the supranational and national legal orders is significant in this respect, and therefore the focus falls particularly on the interaction between the Court of Justice of the European Union (hereinafter the ECJ) and the Member States’ (constitutional/supreme) courts. It is observed that a functioning dialogue between the supranational and national courts, based on mutual respect and judicial deference, is an important prerequisite for the realization of the rule of law within Europe. In order to afford a concrete example, a recent case, C-62/14 Gauweiler v Deutscher Bundestag, is introduced and analysed in relation to the notorious relationship between the Federal Constitutional Court of Germany and the ECJ.
The implications of the ECJ’s decision in Gauweiler v Deutscher Bundestag are assessed with reference to some of the pressing issues of constitutionalism within Europe, and some institutional aspects are also brought forward. Lastly, the feasibility of constitutional pluralism as a theoretical setting is measured against the legal reality of today’s Europe and its many constitutions. The hierarchical idea of one ultimate source of power, stemming from the traditional approaches to legal systems, is then assessed in relation to the requirement of the realization of the rule of law within the European Union from the supranational and national points of view.
Abstract:
This study is motivated by the question of how resource-scarce innovative entrepreneurial companies seek and leverage global resources. This study takes the resource-seeking perspective a step forward and suggests that the resources that enable entrepreneurial internationalisation are largely accrued from the early stages of entrepreneurial life, that is, from innovation development. Consequently, this study seeks to explain how innovation and internationalisation processes are interrelated in entrepreneurial internationalisation. This main objective is approached through three research questions: (1) What role do inter-organisational relationships in innovation have in the entrepreneurial internationalisation process? (2) What kind of inward–outward links do inter-organisational relationships create in the resource-seeking-based entrepreneurial internationalisation process? (3) What kind of capability to collaborate is formed in the interaction of inter-organisational relationship deployment? The research design is a mixed-methods design that consists of a quantitative pilot study and a qualitative multiple case study of five entrepreneurial life science companies from Finland and Austria. The findings show that innovation and internationalisation processes are tightly interwoven in the pre-internationalisation state. The findings also reveal that the more experienced companies are able to take advantage of complex cross-border inter-organisational relationship structures better than the starting companies. However, very minor evidence was found of inward links translating into outward links in the entrepreneurial internationalisation process, despite the expectation to observe more of these links in the data. Combined intangible-tangible resource-seeking was the most preferred way to build links between inward–outward internationalisation, but also to develop the competence to collaborate.
By adopting a resource-seeking instead of a market-seeking approach, this study illustrated that internationalisation extends to the early stages of innovative companies, and that in high-technology companies potentially significant cross-border relationships have started to form long before incorporation. Therefore, these observations justify a firmer inclusion of pre-company history in innovative entrepreneurship studies. The study offers a conceptualisation of entrepreneurial internationalisation that is perceived as a process. The main theoretical contributions are in the areas of international entrepreneurship and in the behavioural process studies of entrepreneurial internationalisation and resource-based internationalisation. The inclusion of the innovation-based discussion, namely the innovation process, in the internationalisation process theories has clearly contributed to the understanding of entrepreneurial internationalisation in the context of international entrepreneurship. Innovation development is a central act of entrepreneurial companies, and omitting the investigation of the innovation process from entrepreneurial internationalisation leaves potentially influential mechanisms unexplored.
Abstract:
The increasing performance of computers has made it possible to solve algorithmically problems for which manual, and possibly inaccurate, methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure at different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is suitable also for parallel machines such as GPUs.
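The core geometric step of the fetch length problem, casting a ray from a study point in a fixed direction and finding the nearest shoreline segment it hits, can be sketched as below; the function name and the segment representation are illustrative assumptions, not the thesis's algorithms, which must handle millions of vertices far more efficiently than this linear scan.

```python
import math

def fetch_length(point, direction_deg, shoreline_segments, max_fetch=1e9):
    """Distance from `point` along bearing `direction_deg` to the nearest shoreline segment."""
    px, py = point
    th = math.radians(direction_deg)
    dx, dy = math.cos(th), math.sin(th)        # unit ray direction
    best = max_fetch                           # open water if nothing is hit
    for (x1, y1), (x2, y2) in shoreline_segments:
        sx, sy = x2 - x1, y2 - y1
        denom = dx * sy - dy * sx              # 2D cross product of directions
        if abs(denom) < 1e-12:                 # ray parallel to segment: no hit
            continue
        # solve point + t*(dx, dy) == (x1, y1) + u*(sx, sy)
        t = ((x1 - px) * sy - (y1 - py) * sx) / denom
        u = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t > 0 and 0.0 <= u <= 1.0:          # hit in front of the point, within the segment
            best = min(best, t)
    return best

# Hypothetical shoreline: a vertical segment 5 units east of the study point.
d = fetch_length((0.0, 0.0), 0.0, [((5.0, -1.0), (5.0, 1.0))])
```

The thesis's exact and approximate algorithms speed this up with spatial preprocessing so the per-ray cost does not scale with the total number of polygon vertices.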
Abstract:
Wind power is a rapidly developing, low-emission form of energy production. In Finland, the official objective is to increase wind power capacity from the current 1 005 MW up to 3 500–4 000 MW by 2025. By the end of April 2015, the total capacity of all wind power projects being planned in Finland had surpassed 11 000 MW. As the amount of projects in Finland is record high, an increasing amount of infrastructure is also being planned and constructed. Traditionally, these planning operations are conducted using manual and labor-intensive work methods that are prone to subjectivity. This study introduces a GIS-based methodology for determining optimal paths to support the planning of onshore wind park infrastructure alignment in Nordanå-Lövböle wind park, located on the island of Kemiönsaari in Southwest Finland. The presented methodology utilizes a least-cost path (LCP) algorithm for searching for optimal paths within a high-resolution real-world terrain dataset derived from airborne lidar scanning. In addition, planning data is used to provide a realistic planning framework for the analysis. In order to produce realistic results, the physiographic and planning datasets are standardized and weighted according to qualitative suitability assessments by utilizing methods and practices offered by multi-criteria evaluation (MCE). The results are presented as scenarios corresponding to various planning objectives. Finally, the methodology is documented by using tools of Business Process Management (BPM). The results show that the presented methodology can be effectively used to search for and identify extensive, 20 to 35 kilometer long networks of paths that correspond to certain optimization objectives in the study area. The utilization of high-resolution terrain data produces a more objective and more detailed path alignment plan.
This study demonstrates that the presented methodology can be practically applied to support a wind power infrastructure alignment planning process. The six-phase structure of the methodology allows straightforward incorporation of different optimization objectives. The methodology responds well to combining quantitative and qualitative data. Additionally, the careful documentation presents an example of how the methodology can be evaluated and developed as a business process. This thesis also shows that more emphasis on the research of algorithm-based, more objective methods for the planning of infrastructure alignment is desirable, as technological development has only recently started to realize the potential of these computational methods.
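The LCP search at the heart of such a methodology can be sketched as Dijkstra's algorithm over a small cost raster; the `terrain` grid and the 4-connected neighbourhood are simplifying assumptions for illustration, not the study's actual datasets or GIS software.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a 2D cost raster; moving into a cell pays that cell's cost."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):   # stale queue entry
            continue
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    # reconstruct the path by walking predecessors back from the goal
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1], dist[goal]

# Hypothetical cost surface: a high-cost ridge forces a detour around it.
terrain = [[1, 1, 1],
           [9, 9, 1],
           [1, 1, 1]]
path, total = least_cost_path(terrain, (0, 0), (2, 0))
```

In the study's setting, the cell costs would come from the MCE-weighted physiographic and planning layers, and an 8- or 16-connected neighbourhood would give smoother alignments.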
Abstract:
"We teach who we are" (Palmer, 1998, p. 2). This simple, yet profound, statement was the catalyst that began my thesis journey. Using a combination of self-study and participant narratives, Palmer's idea was explored as a search for authenticity. The self-study component of this narrative was enhanced by the stories of two other teachers, both women. I chose to use narrative methodology to uncover and discover the relationship between the personal and professional lives of being a teacher. Do teachers express themselves daily in their classrooms? Do any lessons from the classroom translate into teachers' personal lives? The themes of reflection, authenticity, truth, and professional development thread themselves throughout this narrative study. In order to be true to myself as a teacher/researcher, arts-based interpretations accompany my own and each participant's profile. Our conversations about our pasts, our growth as teachers, and our journeys as individuals were captured in poetry and photographic mosaics. Through rich and detailed stories we explored who we are as teachers and how we became this way. The symbiotic relationship between our personal and professional lives was illustrated by tales of bravery, self-discovery, and reflection. The revelations uncovered illustrate the powerful role our past plays in shaping the present and potentially the future. It may seem indulgent to spend time exploring who we are as teachers in a time that is increasingly focused on improving student test scores. Yet, the truth remains that, "Knowing myself is as crucial to good teaching as knowing my students and my subject" (Palmer, 1998, p. 2).