976 results for location-allocation problem


Relevance:

20.00%

Publisher:

Abstract:

The hydrological and biogeochemical processes that operate in catchments influence the ecological quality of freshwater systems through delivery of fine sediment, nutrients and organic matter. Most models that seek to characterise the delivery of diffuse pollutants from land to water are reductionist. The multitude of processes that are parameterised in such models to ensure generic applicability makes them complex and difficult to test on available data. Here, we outline an alternative, data-driven, inverse approach. We apply SCIMAP, a parsimonious risk-based model with an explicit treatment of hydrological connectivity, and take a Bayesian approach to the inverse problem of determining the risk that must be assigned to different land uses in a catchment in order to explain the spatial patterns of measured in-stream nutrient concentrations. We apply the model to identify the key sources of nitrogen (N) and phosphorus (P) diffuse pollution risk in eleven UK catchments covering a range of landscapes. The model results show that: 1) some land uses generate a consistently high or low risk of diffuse nutrient pollution; 2) the risks associated with different land uses vary both between catchments and between nutrients; and 3) the dominant sources of P and N risk in a catchment are often a function of the spatial configuration of land uses. Taken on a case-by-case basis, this type of inverse approach may be used to help prioritise interventions to reduce diffuse pollution risk for freshwater ecosystems. (C) 2012 Elsevier B.V. All rights reserved.
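As a rough illustration of the inverse approach described above, the sketch below runs a toy Metropolis sampler over land-use risk weights so that a connectivity-weighted mixture of land uses reproduces observed in-stream nutrient signals. The export matrix, observations, Gaussian likelihood and uniform priors are all hypothetical; this is not the SCIMAP code.

```python
# Toy Bayesian inversion for land-use risk weights (illustration only, not SCIMAP).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: rows = monitored sites, columns = land-use classes.
# export[i, j] = connectivity-weighted share of land use j draining to site i.
export = rng.dirichlet(np.ones(4), size=12)
true_w = np.array([0.8, 0.1, 0.5, 0.05])                # "unknown" risk weights
obs = export @ true_w + rng.normal(0.0, 0.02, size=12)  # observed nutrient signal

def log_posterior(w, sigma=0.02):
    """Uniform prior on [0, 1]^4 plus a Gaussian misfit to the observations."""
    if np.any(w < 0.0) or np.any(w > 1.0):
        return -np.inf
    return -0.5 * np.sum((export @ w - obs) ** 2) / sigma ** 2

# Random-walk Metropolis sampler over the four risk weights.
w = np.full(4, 0.5)
lp = log_posterior(w)
samples = []
for _ in range(20_000):
    prop = w + rng.normal(0.0, 0.05, size=4)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, lp = prop, lp_prop
    samples.append(w.copy())

posterior = np.array(samples[5_000:])                   # discard burn-in
print("posterior mean risk weights:", posterior.mean(axis=0).round(2))
```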

Relevance:

20.00%

Publisher:

Abstract:

The fast development of new technologies such as digital medical imaging has driven the expansion of brain functional studies. A key methodological issue in such studies is the comparison of neuronal activation between individuals, and in this context the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalising subjects' brains to a standard brain; the most widely used standards are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute (MNI) standard brain (SPM99). However, these registration methods are not precise enough to superpose the more variable portions of the cerebral cortex (e.g. the neocortex and the perisylvian zone) or brain regions that are highly asymmetric between the two hemispheres (e.g. the planum temporale). The aim of this thesis is to evaluate a new image processing technique based on non-rigid, model-based registration. In contrast to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another. We identify and extract anatomical features (point landmarks) in both the deforming and the reference images, and their correspondence determines the appropriate deformation in 3D. As landmarks we use six control points, located bilaterally on Heschl's gyrus, on the motor hand area and on the sylvian fissure. The evaluation of this model-based approach is performed on the MRI and fMRI images of nine of the eighteen subjects who participated in an earlier study by Maeder et al. On the anatomical (MRI) images, the results show the movement of the deforming brain's control points to the locations of the reference brain's control points; the distance between the deforming and the reference brain is smaller after registration than before. Registration of the functional (fMRI) images does not show a significant variation: the small number of landmarks (six) is evidently not sufficient to produce significant changes in the fMRI statistical maps. This thesis opens the way to a new computational technique for cortex registration, whose main direction will be to improve the registration algorithm by using not a single point but many points representing a particular sulcus.
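As a rough illustration of the landmark-based idea described above, the sketch below fits a 3D affine transform to six corresponding control points by least squares and checks that the mean landmark distance shrinks after registration. The coordinates are invented and the transform is affine rather than the thesis's non-rigid model; it is meant only to show how landmark correspondences drive a registration.

```python
# Minimal landmark-based registration sketch (illustration, not the thesis's code).
import numpy as np

# Hypothetical landmark coordinates (mm), one row per control point.
deforming = np.array([[42., -20., 8.], [-44., -22., 9.], [38., -24., 58.],
                      [-36., -26., 56.], [50., -12., 2.], [-48., -14., 3.]])
reference = np.array([[44., -19., 10.], [-45., -21., 11.], [40., -22., 60.],
                      [-38., -25., 57.], [52., -10., 4.], [-50., -13., 5.]])

# Solve reference = [deforming | 1] @ A by least squares for the 4x3 affine matrix A.
X = np.hstack([deforming, np.ones((deforming.shape[0], 1))])
A, *_ = np.linalg.lstsq(X, reference, rcond=None)

def warp(points):
    """Apply the fitted affine transform to an (N, 3) array of points."""
    pts = np.asarray(points, dtype=float)
    return np.hstack([pts, np.ones((pts.shape[0], 1))]) @ A

# The residual landmark distance should shrink after registration.
before = np.linalg.norm(deforming - reference, axis=1).mean()
after = np.linalg.norm(warp(deforming) - reference, axis=1).mean()
print(f"mean landmark distance: {before:.2f} mm before, {after:.2f} mm after")
```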

Relevance:

20.00%

Publisher:

Abstract:

Despite the rapid change in today's business environment, there are relatively few studies of corporate renewal. This study aims, for its part, to fill that research gap by studying the concepts of strategy, corporate renewal, innovation and corporate venturing. Its purpose is to enhance our understanding of how established companies operating in a dynamic and global environment can benefit from their corporate venturing activities. The theoretical part approaches the research problem at the corporate and venture levels. Firstly, it focuses on mapping the determinants of strategy and suggests using industry, location, resources, knowledge, structure and culture, market, technology and business model to assess the environment, and using these determinants to optimize the speed and magnitude of change. Secondly, it concludes that the choice of innovation strategy depends on the type and dimensions of innovation and suggests assessing market, technology and business model, as well as the novelty and complexity related to each of them, in order to choose an optimal context for developing innovations further. Thirdly, it directs attention to the processes through which corporate renewal takes place. At the corporate level these processes are identified as strategy formulation, strategy formation and strategy implementation; at the venture level the renewal processes are identified as learning, leveraging and nesting. The theoretical contribution of this study, the framework of strategic corporate venturing, joins corporate- and venture-level management issues together and concludes that strategy processes and linking processes are the mechanism through which continuous corporate renewal takes place. The framework of strategic corporate venturing proposed by this study is a new way to illustrate the role of corporate venturing as a purposefully built, different view of a company's business environment. The empirical part extended the framework by enhancing our understanding of the link between corporate renewal and corporate venturing in its real-life environment in three Finnish companies: Metso, Nokia and TeliaSonera. Characterizing the companies' environment with the determinants of strategy identified in this study provided a structured way to analyze their competitive position and the renewal challenges they are facing. More importantly, the case studies confirmed that a link between corporate renewal and corporate venturing exists, and found that the link is not as straightforward as indicated by the theory. Furthermore, the case studies enhanced the framework by indicating a sequence according to which the processes work. Firstly, the induced strategy processes, strategy formulation and strategy implementation, set the scene for the corporate venturing context and management processes and leave strategy formation to the venture. Only after that can strategies formed by ventures come back to the corporate level and, if found viable at the corporate level, be formalized through formulation and implementation. With the help of the framework of strategic corporate venturing, the link between corporate renewal and corporate venturing can be found and managed. The suggested response to the continuous need for change is continuous renewal, i.e. institutionalizing corporate renewal in the strategy processes of the company. As far as benefiting from venturing is concerned, the answer lies in deliberately managing venturing in a context different from the mainstream businesses and in establishing efficient linking processes to exploit the renewal potential of individual ventures.

Relevance:

20.00%

Publisher:

Abstract:

In the European Union, the importance of mobile communications was realized early on. The process of mobile communications becoming ubiquitous has taken time, as the innovation of mobile communications has diffused into society. The aim of this study is to find out how the evolution and spatial patterns of the diffusion of mobile communications within the European Union could be taken into account in forecasting the diffusion process. Compared with the territorial level, there is a relatively large body of research on innovation diffusion at the individual (micro) and country (macro) levels. Territorial or spatial diffusion refers either to the intra-country or the inter-country diffusion of an innovation, and in both settings the diffusion of a technological innovation has received scarce attention. This study adds to the knowledge of diffusion between countries, focusing especially on the role of location in this process. The main findings of the study are the following. The penetration rates of the European Union member countries became more even over the period of observation, from 1981 to 2000; the common digital GSM system seems to have hastened this process. As to the role of location in the diffusion process, neighboring countries have had similar diffusion processes, and they can be grouped into three: the Nordic countries, the central and southern European countries, and the remote southern European countries. The neighborhood effect also dominates in the gravity model used for modeling the adoption timing of the countries. The subsequent diffusion within a country, measured by the logistic model for Finland, is affected positively by its economic situation and seems to level off at some 92 %. Considering the launch of future mobile communications systems, using a common standard should imply an equal development between the countries. The launch time should be carefully selected, as the diffusion is probably delayed in economic downturns. The location of a country, measured by distance, can be used in forecasting adoption and diffusion. Finally, the finding that penetration rates become more even implies that in a relatively homogeneous set of countries, such as the European Union member countries, the estimated final penetration of a single country can be used to approximate the penetration of the others. The estimated eventual penetration of Finland, some 92 %, should thus also be the eventual level for all the European Union countries and for the European Union as a whole.
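As a rough illustration of the intra-country modelling mentioned above, the sketch below fits a logistic diffusion curve to a penetration-rate time series. The yearly figures are invented for demonstration; only the roughly 92 % saturation level quoted in the abstract is reused, as the starting guess for the fit.

```python
# Fitting a logistic diffusion curve to (invented) mobile-phone penetration data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic diffusion: K = saturation level, r = growth rate, t0 = midpoint year."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(1981, 2001)
penetration = np.array([0.2, 0.4, 0.7, 1.1, 1.8, 2.8, 4.3, 6.5, 9.5, 13.5,
                        18.5, 25.0, 32.5, 41.0, 50.0, 59.0, 67.5, 75.0, 81.0, 85.5])

(K, r, t0), _ = curve_fit(logistic, years, penetration, p0=(92.0, 0.4, 1995.0))
print(f"estimated saturation K = {K:.1f} %, growth rate r = {r:.2f}, midpoint {t0:.1f}")
```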

Relevance:

20.00%

Publisher:

Abstract:

The patent system was created for the purpose of promoting innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how that invention is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits. Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investment as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries such as ICT that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with a focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which the academic, political, legal and business developments that concern software and business-method patents are investigated and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change that the patent system is facing, and of how these challenges are reflected in standard setting.

Relevance:

20.00%

Publisher:

Abstract:

We prove that there are one-parameter families of planar differential equations for which the center problem has a trivial solution and, on the other hand, the cyclicity of the weak focus is arbitrarily high. We illustrate this phenomenon in several examples for which this cyclicity is computed.
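For context, a hedged sketch of the standard setting behind this terminology (not the specific one-parameter families treated in the paper):

```latex
% Illustrative only: the generic setting, not the families studied in the paper.
% A one-parameter family of planar systems with a monodromic singular point at
% the origin can be written as
\[
  \dot{x} = -y + P_\lambda(x,y), \qquad
  \dot{y} = x + Q_\lambda(x,y), \qquad \lambda \in \mathbb{R},
\]
% where $P_\lambda$ and $Q_\lambda$ vanish to second order at the origin.
% The center problem asks, for each $\lambda$, whether the origin is a center
% (all nearby orbits are closed) or a weak focus; the cyclicity of the weak
% focus is the maximum number of limit cycles that can bifurcate from it under
% small perturbations within the family.
```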

Relevance:

20.00%

Publisher:

Abstract:

Severe acute refractory respiratory failure is considered a life-threatening situation, with a high mortality of 40 to 60%. When conservative oxygenation methods fail, a lifesaving measure is the introduction of extracorporeal membrane oxygenation (ECMO). Venovenous ECMO (VV-ECMO) is a preferred modality of support for patients with refractory acute respiratory failure. Specifically, bicaval VV-ECMO is a well-recognized and validated therapy in which single or double peripheral venous access is used for the insertion of two differently sized cannulas in order to achieve adequate blood oxygenation. Compared with venoarterial ECMO, VV-ECMO has a lower rate of complications such as thrombosis, bleeding, infection and ischemic events. On the other hand, the size and insertion location of the cannulas are an obstacle to patient mobilization. This is a considerable problem for patients in whom the time to lung recovery or the bridge to transplantation is prolonged. To address this issue, a dual-lumen, single venovenous cannula was introduced. Here, satisfactory oxygenation of the patient is achieved by inserting a single catheter into one target vessel, in the majority of cases the right internal jugular vein. In this form, VV-ECMO enables patient mobility and better physical rehabilitation, and facilitates extubation and pulmonary toilet. However, relatively soon after the first short-term reports were published, a relatively high complication rate became evident. In the recent literature, the complication rate with currently commercially available double-lumen venovenous cannulas ranges between 5 and 30%. These complications were mostly associated with the implantation phase or the early postoperative period and range from right heart perforation to migration of the cannula. This review focuses on complications related to the implantation of commercially available dual-lumen, single venovenous cannulas, pointing out the critical segments of the implantation process and analyzing the structure of the device.

Relevance:

20.00%

Publisher:

Abstract:

This empirical study investigates the effects of long-term, embedded, structured and supported instruction in secondary education on the development of Information Problem Solving (IPS) skills. Forty secondary students in the 7th and 8th grades (13–15 years old) took part in the 2-year study: twenty of them received the IPS instruction designed in this study, and the remaining twenty formed the control group. All the students were pre- and post-tested in their regular classrooms, and their IPS process and performance were logged by means of screen-capture software to ensure ecological validity. The IPS constituent skills, the web search sub-skills and the answers given by each participant were analyzed. The main findings suggest that the experimental students showed a more expert pattern than the control students regarding the constituent skill 'defining the problem' and the following two web search sub-skills: the 'search terms' typed into a search engine and the 'selected results' from a SERP. In addition, task performance scores were statistically better for the experimental students than for the control-group students. The paper contributes to the discussion of how well-designed and well-embedded scaffolds can be built into instructional programs to guarantee the development and efficiency of students' IPS skills, so that they use online information better and participate fully in the global knowledge society.

Relevance:

20.00%

Publisher:

Abstract:

Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have a rectangular shape in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate with problem hardness.
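As a rough illustration of the problem domain described above, the sketch below builds a solved Generalized Sudoku grid with rectangular p x q blocks and punches holes in QWH style. It does not reproduce the authors' instance generator or their balancing strategies; the order, seed and hole ratio are arbitrary.

```python
# Generalized Sudoku with p-row by q-column blocks, order n = p*q (illustration only).
import random

def full_grid(p, q):
    """Canonical solved grid: rows, columns and p x q blocks each hold 0..n-1 once."""
    n = p * q
    return [[((r % p) * q + r // p + c) % n for c in range(n)] for r in range(n)]

def punch_holes(grid, n_holes, seed=0):
    """Blank out n_holes cells uniformly at random (QWH-style instance)."""
    rng = random.Random(seed)
    n = len(grid)
    puzzle = [row[:] for row in grid]
    for r, c in rng.sample([(r, c) for r in range(n) for c in range(n)], n_holes):
        puzzle[r][c] = None
    return puzzle

# Example: order-6 Sudoku with 2x3 blocks, about 60% of the cells blanked.
solution = full_grid(2, 3)
instance = punch_holes(solution, n_holes=int(0.6 * 36), seed=1)
for row in instance:
    print(" ".join("." if v is None else str(v + 1) for v in row))
```

Permuting symbols, rows within a band of p rows, columns within a stack of q columns, or whole bands and stacks yields further solved grids from the canonical one.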

Relevance:

20.00%

Publisher:

Abstract:

The aim of this Master's thesis is to find out which regional factors Finnish companies take into account when choosing a suitable location for a direct investment within Russia. A few firm-internal factors are used as background variables to explain the differences observed in the weighting of location factors between different types of companies. Finally, Russian regions are compared in the light of these weightings. The first part of the thesis focuses on the theoretical background of foreign direct investment. Earlier studies are reviewed in order to map the factors that have been found to influence the location of investments within a country. The latter part of the thesis is based on empirical data collected through a company survey. The data are used to determine which factors Finnish companies consider when making their location decisions. In the light of the results, it is evident that a region's market potential is the most important factor Finnish companies consider when deciding on the location of an investment. Infrastructure and cost advantages also influence the decision. The weightings of different types of companies are very similar. Of the Russian regions, Moscow and St. Petersburg best meet the criteria Finnish companies set for the location of an investment.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis was to develop the target company's cost accounting to better meet the requirements of its changed operating environment. The problem in the target company's cost accounting has been the difficulty of allocating overhead costs, and activity-based costing was believed to alleviate the problems related to overhead allocation. Literature on activity-based costing was used as the source material for the theoretical part of the thesis. The empirical part consists of the company's internal sources, views formed on the basis of staff interviews, and the results produced by the activity-based costing model built in this thesis. When designing the structure of the model, it was considered important that the costing model could also be used in the group's other units: uniform cost accounting principles covering the whole group make it possible to compare the cost structures of different units. With activity-based costing, the target company has obtained more accurate information on product-specific manufacturing costs. Determining product-specific manufacturing costs also makes it possible to determine the profitability of individual customers and customer groups. In the course of developing activity-based costing, the target company also gained useful information about which activities generate the largest costs. Activity-based costing also reveals the activities that should, and can, be made more efficient.
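As a rough illustration of the activity-based costing calculation discussed above, the sketch below allocates overhead from activity cost pools to products via cost-driver rates. All activities, drivers and figures are hypothetical, not the target company's.

```python
# Minimal activity-based costing allocation (hypothetical figures).
activity_pools = {                      # overhead cost per activity (EUR)
    "machine setup": 60_000.0,
    "quality inspection": 30_000.0,
    "order handling": 45_000.0,
}
driver_volumes = {                      # total cost-driver volume per activity
    "machine setup": 300,               # setups
    "quality inspection": 1_500,        # inspections
    "order handling": 900,              # orders
}
products = {                            # driver consumption per product
    "product A": {"machine setup": 120, "quality inspection": 400, "order handling": 200},
    "product B": {"machine setup": 180, "quality inspection": 1_100, "order handling": 700},
}

# Driver rate = pool cost / driver volume; product overhead = sum(rate * consumption).
rates = {a: activity_pools[a] / driver_volumes[a] for a in activity_pools}
for name, use in products.items():
    overhead = sum(rates[a] * use[a] for a in use)
    print(f"{name}: allocated overhead {overhead:,.0f} EUR")
```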