980 results for Task-Oriented Methodology
Abstract:
BACKGROUND: The proportion of surgery performed as a day case varies greatly between countries. Low rates suggest a large growth potential in many countries. Measuring the potential for growth in day surgery should be grounded in a comprehensive list of eligible procedures, based on a priori criteria and independent of local practices. We propose an algorithmic method, using only routinely available hospital data, to identify surgical hospitalizations that could have been performed as day cases. METHODS: Moving an inpatient surgery to day surgery was considered feasible if at least one surgical intervention was eligible for day surgery and if none of the following criteria were present: an intervention or affection requiring an inpatient stay, patient transferred or died, or length of stay greater than four days. The eligibility of a procedure to be treated as a day case was established mainly on three a priori criteria: surgical access (endoscopic or not), the invasiveness of the procedure, and the size of the operated organ. A few overrides of these criteria occurred when procedures were associated with a risk of immediate complications, slow physiological recovery, or pain treatment requiring hospital infrastructure. The algorithm was applied to a random sample of one million US inpatient stays and more than 600 thousand Swiss inpatient stays from the year 2002. RESULTS: The validity of our method was demonstrated by the few discrepancies between the list of eligible procedures based on a priori criteria and a state list used for reimbursement purposes, by the low proportion of hospitalizations eligible for day care found in the US sample (4.9% versus 19.4% in the Swiss sample), and by the distribution of the elective procedures found eligible in Swiss hospitals, which is well supported by the literature. The proportion of candidates for day surgery among elective surgical hospitalizations varied widely between Swiss hospitals (3% to 45.3%). CONCLUSION: The proposed approach allows monitoring of the proportion of inpatient stays that are candidates for day surgery. It could be used for infrastructure planning, resource negotiation, and the surveillance of appropriate resource utilization.
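Read as pseudocode, the feasibility rule above is a short Boolean test over routinely coded discharge data. A minimal sketch in Python; the record fields and the two code sets are illustrative assumptions, not the study's actual data dictionary:

```python
from dataclasses import dataclass
from typing import List, Set

# Hypothetical discharge record: field names are illustrative,
# not taken from the study's actual data dictionary.
@dataclass
class Stay:
    procedure_codes: List[str]     # coded surgical interventions
    diagnosis_codes: List[str]     # coded affections
    length_of_stay_days: int
    transferred: bool
    died: bool

# Placeholder code sets: in the study these lists were built from
# the a priori criteria (surgical access, invasiveness, organ size).
DAY_SURGERY_ELIGIBLE: Set[str] = set()   # procedures eligible for day surgery
REQUIRES_INPATIENT: Set[str] = set()     # interventions/affections requiring an inpatient stay

def candidate_for_day_surgery(stay: Stay) -> bool:
    """True if the inpatient stay could have been performed as a day case."""
    # At least one performed procedure must be eligible for day surgery.
    if not any(c in DAY_SURGERY_ELIGIBLE for c in stay.procedure_codes):
        return False
    # Exclusion criteria listed in the abstract.
    if any(c in REQUIRES_INPATIENT
           for c in stay.procedure_codes + stay.diagnosis_codes):
        return False
    if stay.transferred or stay.died:
        return False
    return stay.length_of_stay_days <= 4
```

Because every test reads only fields present in standard discharge abstracts, the same function can be run unchanged over samples from different countries, which is what allows the US/Swiss comparison reported above.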
Abstract:
In October 2011 the Task Force on Therapeutic Drug Monitoring of the Association for Neuropsychopharmacology and Pharmacopsychiatry (AGNP) published an update (Pharmacopsychiatry 2011, 44: 195-235) of the first version of its consensus paper on therapeutic drug monitoring (TDM), published in 2004. This article summarizes the essential statements to make them accessible to a wider readership in German-speaking countries.
Abstract:
This thesis examines whether scenario planning supports organizational strategy as a method for addressing uncertainty. The main issues are why, what, and how scenario planning fits into organizational strategy, and how the process could be supported to make it more effective. The study follows the constructive approach. It starts with an examination of competitive advantage, the way an organization develops strategy, and how it addresses uncertainty in its operating environment. Based on the literature review conducted, scenario methods would seem to provide a versatile platform for addressing future uncertainties. The construction is formed by examining the scenario methods and presenting suitable support methods, resulting in a theoretical proposition for a supported scenario process. The theoretical framework is tested under laboratory conditions, and the results from the test sessions are used as a basis for scenario stories. The process of forming the scenarios and the results are illustrated and presented for scrutiny.
Abstract:
The purpose of the study was to examine, from a management perspective, the applicability of activity-based management to the grocery retail business of Pirkanmaan Osuuskauppa. The goal was to construct an activity-based organizational model for the case company and to examine the preconditions for organizational change, as well as the influence of strategy on the choice of organizational model. The study uses the constructive research approach and is a theoretical, action-analytical case study. Activity-based management, which aims at the horizontal steering of business operations, is not suited to managing grocery retail. It does not improve the competitiveness of Pirkanmaan Osuuskauppa's supermarket business, because its resource requirements are high and it therefore conflicts with a cost-efficiency strategy. Organizational change demands, especially from management, strategic thinking and the ability to see the threats the current operating model faces in the future competitive environment, and, on the other hand, to recognize the future success factors whose realization will keep the company successful. Management must be able to communicate to the personnel the necessity of change and a picture of the future operating model; management's role and its support for the organization during the change process are central. Strategy is the means of realizing the company's vision, aims, and objectives, and the organization of operations is the principal means of achieving strategic aims and objectives.
Abstract:
In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid-organ recipients at the national level. Key features of the STCS are a flexible patient-case system that can capture all transplant scenarios, and the collection of patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and bio-banking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations and a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. By the end of 2011 we had observed 4,385 infection episodes in our patient population. The STCS has demonstrated the operational capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. The STCS represents a promising novel project for comparative effectiveness research in transplantation medicine.
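The abstract does not detail the "flexible patient-case system", so the following Python sketch is only one plausible reading of it: a patient record holding any number of transplantation cases, each grouping the organs implanted in that scenario. All type and field names here are assumptions, and the real STCS schema is far richer (psychosocial data, infection episodes, bio-banking):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class OrganTransplant:
    organ: str           # e.g. "kidney"
    living_donor: bool   # 34% of kidney allografts came from living donation

@dataclass
class TransplantCase:
    transplant_date: date
    organs: List[OrganTransplant]     # one case may implant several organs
    is_retransplantation: bool = False

@dataclass
class Patient:
    patient_id: str
    cases: List[TransplantCase] = field(default_factory=list)

# One patient, one case, two organs: a combined kidney-pancreas scenario,
# illustrating why transplantations (1,721) and organs (1,800) differ.
p = Patient("STCS-0001", [TransplantCase(
    date(2009, 3, 14),
    [OrganTransplant("kidney", living_donor=True),
     OrganTransplant("pancreas", living_donor=False)])])
```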
Abstract:
Multimodal brain imaging is becoming a leading tool for understanding different aspects of brain structure and function. Thanks to advances in Magnetic Resonance Imaging (MRI) acquisition schemes and data processing techniques, it is now possible to measure different parameters sensitive to different tissue characteristics.
This makes it possible, for example, to investigate the anatomical substrates underlying cognitive processing, or to disentangle, at a purely structural level, degenerative and developmental processes. This thesis highlights the importance of a multimodal approach for investigating different aspects of brain dynamics by applying it to two clinical studies: a functional and structural assessment of the acute effects of cannabis smoking in regular and occasional users, and a grey and white matter assessment in young FMR1 premutation carriers at risk of developing FXTAS (Fragile-X Tremor Ataxia Syndrome). We demonstrate that in occasional smokers, cannabis smoking, even at a low blood concentration of the main psychoactive component (THC), strongly decreases performance on a visuo-motor tracking task and globally alters the activity of the three brain networks involved in cognitive processing: the Salience, Executive Control, and Default Mode networks. Subjects are unable to capture saliences in the environment and to orient attention to the task; the increased hemodynamic response in the Anterior Cingulate Cortex suggests an increase in self-oriented mental activity. A further investigation of long-term exposure to cannabis shows persistent grey matter modifications in brain regions associated with memory and affective processing. The degree of atrophy in these structures also correlates with estimated drug use in the three months prior to participation in the study. In the second study we demonstrate structural changes in young asymptomatic premutation carriers decades before the onset of FXTAS, which might be related to two different mechanisms. Alterations of the cerebellar motor network and of the hippocampal fimbria/fornix may reflect a potential neurodevelopmental effect of the premutation. These include grey matter atrophy in lobule VI and modification of white matter tissue properties in the corresponding afferent projections through the middle cerebellar peduncles. Diffuse hemispheric white matter lesions, which seem to appear closer to the onset of FXTAS and to be related to a neurodegenerative phenomenon, may mark the imminent onset of the disease.
Abstract:
Qualitative differences in strategy selection during foraging in a partially baited maze were assessed in young and old rats. The baited and non-baited arms were at fixed positions in space and marked by specific olfactory cues. The senescent rats made more re-entries during the first four-trial block but were quicker than the young rats to select the reinforced arms during the first visits. Dissociating the olfactory cues from the spatial reference frame by rotating the maze revealed that only a few old subjects relied on olfactory cues to select the baited arms; the remainder relied mainly on the visuo-spatial cues.
Abstract:
Water is often considered to be an ordinary substance since it is transparent, odourless, tasteless, and very common in nature. As a matter of fact, it can be argued that it is the most remarkable of all substances. Without water, life on Earth would not exist. Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms to live in, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with the uniqueness of its physical and chemical properties. Among these, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in the life sciences, water is systematically removed from biological specimens investigated by electron microscopy, because the high vacuum of the electron microscope requires that the biological specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques now in routine use. Typically these techniques consist in fixing the sample (chemically or by freezing) and replacing its water with a plastic, which is transformed into a rigid block by polymerisation. The block is then cut into thin sections (c. 50 nm) with an ultramicrotome at room temperature. Usually, these techniques introduce several artefacts, most of them due to water removal. In order to avoid these artefacts, the specimen can be frozen, cut, and observed at low temperature. However, liquid water crystallizes into ice upon freezing, causing severe damage. Ideally, liquid water is solidified into a vitreous state. Vitrification consists in solidifying water so rapidly that ice crystals have no time to form. A breakthrough took place when the vitrification of pure water was discovered. Since this discovery, the thin-film vitrification method has been used with success for the observation of biological suspensions of small particles. Our work was to extend the method to bulk biological samples, which have to be vitrified, cryo-sectioned into vitreous sections, and observed in a cryo-electron microscope. This technique, called cryo-electron microscopy of vitreous sections (CEMOVIS), is now believed to be the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. Recently, CEMOVIS has become a practical method achieving excellent results. It has, however, some severe limitations, the most important of them certainly being cutting artefacts. They are the consequence of the nature of the vitreous material and of the fact that vitreous sections cannot be floated on a liquid, as is the case for plastic sections cut at room temperature. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, and thus to find optimal conditions to minimise or prevent these artefacts. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.
Abstract:
Massive synaptic pruning following over-growth is a general feature of mammalian brain maturation. Pruning starts near the time of birth and is completed by the time of sexual maturation. Trigger signals able to induce synaptic pruning could be related to dynamic functions that depend on the timing of action potentials. Spike-timing-dependent synaptic plasticity (STDP) is a change in synaptic strength based on the ordering of pre- and postsynaptic spikes. The relation between synaptic efficacy and synaptic pruning suggests that weak synapses may be modified and removed through competitive "learning" rules. Such a plasticity rule might strengthen the connections among neurons that belong to cell assemblies characterized by recurrent patterns of firing. Conversely, connections that are not recurrently activated might decrease in efficacy and eventually be eliminated. The main goal of our study is to determine whether, and under which conditions, such cell assemblies may emerge out of a locally connected random network of integrate-and-fire units distributed on a 2D lattice receiving background noise and content-related input organized in both temporal and spatial dimensions. The originality of our study rests on the relatively large size of the network, 10,000 units, the duration of the experiment, 10^6 time units (one time unit corresponding to the duration of a spike), and the application of an original bio-inspired STDP modification rule compatible with hardware implementation. A first batch of experiments was performed to verify that the randomly generated connectivity and the STDP-driven pruning did not show any spurious bias in the absence of stimulation. Among other things, a scale factor was approximated to compensate for the effect of network size on activity. Networks were then stimulated with the spatiotemporal patterns. The analysis of the connections remaining at the end of the simulations, as well as the analysis of the time series resulting from the activity of the interconnected units, suggests that feed-forward circuits emerge from the initially randomly connected networks by pruning.
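The study's own bio-inspired, hardware-compatible STDP rule is not given in the abstract, so the Python sketch below shows the standard pair-based form of STDP plus a pruning threshold as a stand-in to make the mechanism concrete: a presynaptic spike shortly before a postsynaptic spike potentiates the synapse, the reverse ordering depresses it, and weights that decay toward zero are eliminated. All parameter values are illustrative assumptions:

```python
import numpy as np

# Illustrative constants; not the values used in the study.
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU = 20.0                       # decay time constant (simulation time units)
W_PRUNE = 1e-3                   # synapses below this weight are removed

def stdp_update(w: float, t_pre: float, t_post: float) -> float:
    """Update one synaptic weight from a pre/post spike-time pair."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post: strengthen the synapse
        w += A_PLUS * np.exp(-dt / TAU)
    elif dt < 0:  # post fires before pre: weaken the synapse
        w -= A_MINUS * np.exp(dt / TAU)
    return min(max(w, 0.0), 1.0)   # keep the weight in [0, 1]

def prune(weights: np.ndarray) -> np.ndarray:
    """Eliminate synapses whose efficacy has decayed to near zero."""
    weights[weights < W_PRUNE] = 0.0
    return weights
```

Under such a rule, synapses taking part in recurrently activated firing patterns are repeatedly potentiated, while the rest drift below the pruning threshold, which is the competitive dynamic the abstract credits with carving feed-forward circuits out of the random network.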
Abstract:
The ultimate goal of any research in the mechanism/kinematics/design area may be called predictive design, i.e. the optimisation of mechanism proportions at the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology to facilitate the design analysis and optimisation of complex mechanisms, mechanical components, and systems. As part of a systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired motion requirements. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables; the solutions are obtained by closed-form classical or modern algebraic solution methods, or by numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are (ia) limitations on the number of design specifications and (iia) failure to handle design constraints, especially inequality constraints. The main drawbacks of approximate synthesis formulations are (ib) the difficulty of choosing a proper initial linkage and (iib) the difficulty of finding more than one solution. Recent formulations of the approximate synthesis problem adopt polynomial continuation, providing several solutions, but cannot handle inequality constraints. Based on practical design needs, mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground-pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through a literature survey it is first shown that the algebraic and numerical solution methods used in computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles of the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values for which the system is solvable are considered, including the degenerate cases). By applying the developed solution method to the dyadic equations in direct polynomial form with two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be resolved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally, the optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process, which results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at the system level demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed method is based on combining the two-precision-point formulation with the optimisation of substructures (using mathematical programming techniques, or optimisation methods based on probability and statistics), using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, drawbacks (ia)-(iib) have been eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when integrated with the mechanical system simulation techniques.
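To make the exact-synthesis formulation concrete: it amounts to root-finding on n (generally nonlinear) equations in n structural parameters. The toy Python sketch below shows that generic shape with SciPy; the equations are invented placeholders, not the thesis's dyadic equations, and a plain numerical root-finder returns only one branch per starting guess, which is precisely the limitation the thesis addresses with continuation and algebraic-geometry methods:

```python
import numpy as np
from scipy.optimize import fsolve

# Toy stand-in for an exact-synthesis system: n nonlinear equations
# in n unknown structural parameters x. In the real problem these
# come from loop closure at the prescribed precision points.
def synthesis_equations(x: np.ndarray) -> np.ndarray:
    x1, x2 = x
    return np.array([
        x1**2 + x2**2 - 4.0,   # illustrative constraint 1
        x1 * x2 - 1.0,         # illustrative constraint 2
    ])

# One solution per starting guess; enumerating all branches (let alone
# positive-dimensional solution sets) needs stronger machinery.
for guess in [(1.9, 0.5), (0.5, 1.9)]:
    root = fsolve(synthesis_equations, guess)
    print(root, synthesis_equations(root))  # residuals should be ~0
```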
Abstract:
There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries, at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in research offers a challenging target for a methodological endeavor. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level, and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches to be analyzed, neoclassical and evolutionary. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. The methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria. The aim is to examine how the different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantage through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.
Abstract:
This research has focused on the development of a tuned systematic design methodology that gives the best performance in a computer-aided environment and utilises a cross-technological approach, specifically tested with and for laser-processed microwave mechanics. A tuned design process scheme is also presented. Because of the currently large production volumes of microwave and radio-frequency mechanics, even slight improvements in design methodologies or manufacturing technologies would offer reasonable possibilities for cost reduction; the typical number of required iteration cycles could be reduced to one fifth of normal. The research area dealing with the methodologies can be divided into function-oriented, performance-oriented, and manufacturability-oriented product design. Alternatively, various approaches can be developed for customer-oriented, quality-oriented, cost-oriented, or organisation-oriented design. However, the real need for improvements lies between these two extremes. This means that an effective methodology for designers should be neither too limited (like performance-oriented design) nor too general (like organisation-oriented design), but should include the context of the design environment. This is the area on which the current research is focused. To test the developed tuned design methodology for laser processing (TDMLP) and the tuned optimising algorithm for laser processing (TOLP), seven different industrial product applications for microwave mechanics were designed, CAD-modelled, and manufactured by laser in small production series. To verify that the performance of these products meets the required level, and to ensure the objectiveness of the results, extensive laboratory tests were performed on all designed prototypes. As an example, a Ku-band horn antenna can be laser-processed from steel in 2 minutes while obtaining electrical performance comparable to that of classical aluminium units, and the residual resistance of a laser joint in steel could be limited to 72 milliohms.
Abstract:
1. Introduction

"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1. These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now, in 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential to bring about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted and unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD; and second, to realise the potential and risks inherent in the new legislation in their economic, cultural, and societal dimensions.

2. Subject-matter of the study: basic issues

The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer with markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, lower than that of the traditional form, or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to new electronic production, marketing, and distribution mechanisms, to mention a few. The troublesome side, for authors and publishers, is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection, which in turn can dampen demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, thus weakening access to information, education, and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks

All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks, and local and wide area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues, and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are collected in order to enable the marketing, tendering, and selling of products and services in the above-mentioned networks. Databases and the information embedded in them constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research, and education. A poignant, but not immediately obvious, example is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops, and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: has the collection of this information, and the securing of its validity, required an essential investment? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information, and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public seem to fit poorly, or lead to interpretations that are at variance with the analogue domain as regards the lawful and illegal uses of information. This may well interfere with, or rework, the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services.

4. International sphere

After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must be consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically.
Consequently, guidelines on the correct interpretation of the Directive, importing practical, business-oriented solutions, may well have application at the European level. This underlines the exigency of a thorough analysis of the implications, meaning, and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union's stance, with a direct negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a sui generis database regime or its kin, while both political and academic discourse on the matter abounds.

5. The objectives of the study

The background outlined above, with its several open issues, calls for a detailed study of the following questions:
- What is a database at law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation?
- How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context?
- What opportunities and threats does the current protection present to creators, users, and society as a whole, including its commercial and cultural implications?
- What is the difficult relation between database protection and the protection of factual information as such?

6. Disposition

The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and sui generis right facets in detail, together with the emergent application of this machinery in real-life societal and, particularly, commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright-protected databases, is provided. For purposes of further comparison, a chapter on the precursor of the sui generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impacts of the database protection system and attempts to scrutinise their future implications, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.