988 results for Capabilities Approach
Abstract:
This paper shows the impact of the atomic capabilities concept in incorporating control-oriented knowledge of linear control systems into the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising the advantages of our approach in improving multi-agent performance in cooperative systems.
Abstract:
A basic prerequisite for in vivo X-ray imaging of the lung is the exact determination of radiation dose. Achieving resolutions of the order of micrometres may become particularly challenging owing to increased dose, which in the worst case can be lethal for the imaged animal model. A framework for linking image quality to radiation dose in order to optimize experimental parameters with respect to dose reduction is presented. The approach may find application for current and future in vivo studies to facilitate proper experiment planning and radiation risk assessment on the one hand and exploit imaging capabilities on the other.
Abstract:
Liquid-chromatography (LC) high-resolution (HR) mass spectrometry (MS) analysis can record HR full scans, a technique of detection that shows comparable selectivity and sensitivity to ion transitions (SRM) performed with triple-quadrupole (TQ)-MS but that allows de facto determination of "all" ions, including drug metabolites. This could be of potential utility in in vivo drug metabolism and pharmacovigilance studies in order to gain a more comprehensive insight into drug biotransformation profile differences in patients. This simultaneous quantitative and qualitative (Quan/Qual) approach has been tested with 20 patients chronically treated with tamoxifen (TAM). The absolute quantification of TAM and three metabolites in plasma was realized using HR- and TQ-MS and compared. The same LC-HR-MS analysis allowed the identification and relative quantification of 37 additional TAM metabolites. A number of new metabolites were detected in patients' plasma, including metabolites identified as didemethyl-trihydroxy-TAM-glucoside and didemethyl-tetrahydroxy-TAM-glucoside conjugates, corresponding to TAM with six and seven biotransformation steps, respectively. Multivariate analysis allowed relevant patterns of metabolites and ratios to be associated with TAM administration and CYP2D6 genotype. Two hydroxylated metabolites, α-OH-TAM and 4'-OH-TAM, were newly identified as putative CYP2D6 substrates. The relative quantification was precise (<20 %), and the semiquantitative estimation suggests that metabolite levels are non-negligible. Metabolites could play an important role in drug toxicity, but their impact on drug-related side effects has been partially neglected due to the tremendous effort needed with previous MS technologies. With present HR-MS, this situation should evolve with the straightforward determination of drug metabolites, enlarging the possibilities of studying inter- and intra-patient drug metabolism variability and related effects.
Abstract:
The aim of this thesis is to identify the most important intangible resources needed in product development at industry intersections. Products created at industry intersections are often radical, which makes them interesting and rich in business potential. This study approaches product development from a resource-based view. Knowledge-based and relational views are also employed to emphasise the focus on intangible resources. The study constructs a framework in which different resource categories are examined. The chosen categories are technological, marketing, managerial and administrative, and relational resources. The empirical part examines two new product concepts that have emerged at industry intersections. The goal of the empirical part is to define the preliminary product concepts under study in more detail and to determine what kinds of resources their realisation requires. The current state of the required resources is also assessed, and it is considered whether missing resources should be developed in-house or acquired externally. The study was carried out through expert interviews. Based on the two case studies, relational resources appear to be extremely important in product development at industry intersections. Technological resources are also important. The importance of marketing resources depends on the final product concept, whereas managerial and development-related resources are important in creating these concepts.
Abstract:
This thesis focuses on increasing the rationality of investment appraisal processes for strategic investments in duopoly/oligopoly markets. Its main objective is to examine how a real-options-based investment appraisal method extended with game theory, the extended real options framework, could improve the accuracy of the analyses. The study approaches the problem through investment timing and the interdependencies of an investment's true value attributes. The extended real options framework is an investment analysis and management tool that provides a partly bounded (currently covering only parametric and game-theoretic uncertainty) optimal value range for the true value of an investment. In the framework, real options analysis maps the potential strategic benefits by identifying the different options and uncertainties related to the investment, while game theory highlights the pressures the environment creates for managing investment-related uncertainty. The extended real options framework offers a more rational estimate of a strategic investment's value because it links option exercise, and thus the time value of the options, more consistently to the firm's actual path-dependent capabilities, as constrained by the actions of other market players.
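The basic building block of such real-options valuation can be illustrated with a one-step binomial model of the option to defer an investment. This is a standard textbook construction, not the thesis's extended framework, and all numbers used with it are illustrative:

```python
def defer_option_value(v0, investment, up, down, rf):
    """One-step binomial value of the option to defer an irreversible investment.
    The project value v0 moves to v0*up or v0*down after one period; investing
    costs `investment`; rf is the per-period risk-free rate.
    Risk-neutral probability: q = (1 + rf - down) / (up - down)."""
    q = (1 + rf - down) / (up - down)
    pay_up = max(v0 * up - investment, 0.0)      # invest only if worthwhile
    pay_down = max(v0 * down - investment, 0.0)  # otherwise walk away
    return (q * pay_up + (1 - q) * pay_down) / (1 + rf)
```

For a project worth 100 with an investment cost of 105, investing today has zero net value, yet the option to wait is strictly positive; the extended framework described above would further constrain such values by the anticipated moves of competitors.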
Abstract:
The aim of this thesis was to examine the development and current state of the dynamic capabilities theory. The work also considers the possibilities of combining real options thinking with dynamic capabilities theory. The thesis was carried out as a theoretical literature review. According to dynamic capabilities theory, in a changing business environment a firm's competitive advantage rests on its ability to build, integrate and reconfigure resources and capabilities. Firms must be able to discover, assimilate and transform knowledge in order to identify new opportunities and be able to respond to them. The thesis highlights new connections between dynamic capabilities theory and firm behaviour. Real options thinking helps identify factors that influence how a firm's boundaries are defined. The work offers suggestions for further research on dynamic capabilities theory.
Abstract:
JXTA is a mature set of open protocols, with more than 10 years of history, that enable the creation and deployment of peer-to-peer (P2P) networks, allowing the execution of services in a distributed manner. Throughout its lifecycle, it has slowly evolved in order to appeal to a broad set of different applications. Part of this evolution includes providing basic security capabilities in its protocols in order to achieve some degree of message privacy and authentication. However, in some contexts, more advanced security requirements should be met, such as anonymity. There are several methods to attain anonymity in generic P2P networks. In this paper, we propose how to adapt a replicated message-based approach to JXTA, taking advantage of its idiosyncrasies and capabilities.
Abstract:
The objective of the dissertation is to examine organizational responses of public actors to customer requirements which drive the transformation of value networks and promote public-private partnership in the electricity distribution industry and the elderly care sector. The research bridges the concept of offering to value networks where capabilities can be acquired for novel product concepts. The research contributes to recent literature, re-examining theories on interactions of customer requirements and supply management. A critical realist case study approach is applied to this abductive research, which aims to describe causalities in the analyzed phenomena. The presented evidence is based on three sources: in-depth interviews, archival analysis and the Delphi method. Service provision requires awareness of the technology and functionalities of an offering. Moreover, service provision involves interactions of multiple partners, which suggests the importance of a co-operative orientation among actors. According to the findings, portfolio management has a key role when intelligent solutions are implemented in public service provision, because such concepts involve a variety of resources from multiple suppliers. However, emergent networks are not functional if they lack leaders who have access to the customer interface, the power to steer networks and the capability to build offerings. Public procurement policies were recognized to focus on a narrow scope in which price is a key factor in decisions. In the future, the public sector has to implement technology strategies and portfolio management, which mean long-term platform development and commitment to partnerships. On the other hand, service providers should also be more aware of the offerings into which their products will be integrated in the future. This requires bringing the customer's voice into product development and co-operating in order to increase the interconnectivity of products.
Abstract:
Parameter estimation still remains a challenge in many important applications. There is a need to develop methods that exploit the achievements of modern computational systems with growing capabilities. Owing to this fact, different kinds of Evolutionary Algorithms are becoming an especially promising field of research. The main aim of this thesis is to explore theoretical aspects of a specific class of Evolutionary Algorithms, the Differential Evolution (DE) method, and to implement this algorithm as code capable of solving a large range of problems. Matlab, a numerical computing environment provided by MathWorks Inc., has been utilized for this purpose. Our implementation empirically demonstrates the benefits of stochastic optimizers with respect to deterministic optimizers in the case of stochastic and chaotic problems. Furthermore, the advanced features of Differential Evolution are discussed as well as taken into account in the Matlab realization. Test "toy-case" examples are presented in order to show the advantages and disadvantages caused by additional aspects involved in extensions of the basic algorithm. Another aim of this work is to apply the DE approach to the parameter estimation problem of a system exhibiting chaotic behavior, where the well-known Lorenz system with a specific set of parameter values is taken as an example. Finally, the DE approach for estimation of chaotic dynamics is compared to the Ensemble Prediction and Parameter Estimation System (EPPES) approach, which was recently proposed as a possible solution for similar problems.
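The basic DE loop can be sketched as follows. This is the generic DE/rand/1/bin scheme in Python, not the thesis's Matlab code, and the defaults for population size, mutation factor F and crossover rate CR are illustrative:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimise f over box constraints with the classic DE/rand/1/bin scheme."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)   # random initial population
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct population members, none equal to the target i
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)    # differential mutation
            cross = rng.random(dim) < CR                 # binomial crossover mask
            cross[rng.integers(dim)] = True              # at least one gene from mutant
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fitness[i]:                    # greedy one-to-one selection
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]
```

On a smooth test function such as the sphere, this basic scheme converges to the optimum within a few hundred generations; the extensions discussed in the thesis modify the mutation and crossover steps.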
Abstract:
The proliferation of wireless sensor networks in a large spectrum of applications has been spurred by rapid advances in MEMS (micro-electro-mechanical systems) based sensor technology, coupled with low-power, low-cost digital signal processors and radio frequency circuits. A sensor network is composed of thousands of low-cost, portable devices bearing large sensing, computing and wireless communication capabilities. This large collection of tiny sensors can form a robust distributed data computing and communication system for automated information gathering and distributed sensing. The main attractive feature is that such a sensor network can be deployed in remote areas. Since the sensor nodes are battery powered, all the nodes should collaborate to form a fault-tolerant network so as to provide efficient utilization of precious network resources such as the wireless channel, memory and battery capacity. The most crucial constraint is energy consumption, which has become the prime challenge for the design of long-lived sensor nodes.
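Because energy is the binding constraint, a back-of-the-envelope lifetime estimate usually guides node design. A minimal sketch follows; the battery capacity, current draws and duty cycle below are illustrative assumptions, not figures from the text:

```python
def node_lifetime_days(battery_mah, sleep_ma, active_ma, duty_cycle):
    """Estimate sensor-node lifetime from the average current draw.
    duty_cycle is the fraction of time the radio/CPU are active."""
    avg_ma = active_ma * duty_cycle + sleep_ma * (1 - duty_cycle)
    hours = battery_mah / avg_ma
    return hours / 24

# Illustrative: 2400 mAh battery, 5 uA sleep, 20 mA active.
always_on = node_lifetime_days(2400, 0.005, 20, 1.0)    # radio never sleeps
duty_cycled = node_lifetime_days(2400, 0.005, 20, 0.01) # active 1% of the time
```

The contrast between the two cases (days versus over a year) is why collaborative duty cycling, rather than raw hardware efficiency, dominates sensor-network design.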
Abstract:
Wind energy has emerged as a major sustainable source of energy. The efficiency of wind power generation by wind mills has improved considerably during the last three decades. There is still further scope for maximising the conversion of wind energy into mechanical energy. In this context, wind turbine rotor dynamics has great significance. The present work aims at a comprehensive study of Horizontal Axis Wind Turbine (HAWT) aerodynamics by numerically solving the fluid dynamic equations with the help of a finite-volume Navier-Stokes CFD solver. As a more general goal, the study aims at demonstrating the capabilities of modern numerical techniques for the complex fluid dynamic problems of HAWT. The main purpose is hence to better understand, and thereby maximize, the physics of power extraction by wind turbines. This research demonstrates the potential of an incompressible Navier-Stokes CFD method for the aerodynamic power performance analysis of horizontal axis wind turbines. The National Renewable Energy Laboratory, USA (NREL Technical Report NREL/CP-500-28589) had carried out experimental work aimed at the real-time performance prediction of a horizontal axis wind turbine. In addition to a comparison between the results reported by NREL and the CFD simulations, comparisons are made for the local flow angle at several stations ahead of the wind turbine blades. The comparison has shown that fairly good predictions can be made for pressure distribution and torque. Subsequently, the wind-field effects on the blade aerodynamics, as well as the blade/tower interaction, were investigated. The selected case corresponded to a 12.5 m/s up-wind HAWT at zero degrees of yaw and a rotational speed of 25 rpm. The results obtained suggest that the present method can cope well with the flows encountered around wind turbines. The aerodynamic performance of the turbine and the flow details near and off the turbine blades and tower can be analysed using these results. The aerodynamic performance of airfoils differs from one another. The performance mainly depends on the coefficient of performance, coefficient of lift, coefficient of drag, fluid velocity and angle of attack. This study shows that the velocity is not constant for all angles of attack of different airfoils. The performance parameters are calculated analytically and compared with the standardized performance tests. For different angles of attack, the stall velocity is determined for the better performance of the system with respect to velocity. The research addresses the effect of the surface roughness factor on the blade surface at various sections. The numerical results were found to be in agreement with the experimental data. A relative advantage of the theoretical aerofoil design method is that it allows many different concepts to be explored economically. Such efforts are generally impractical in wind tunnels because of time and money constraints. Thus, the need for a theoretical aerofoil design method is threefold: first, for the design of aerofoils that fall outside the range of applicability of existing catalogs; second, for the design of aerofoils that more exactly match the requirements of the intended application; and third, for the economic exploration of many aerofoil concepts. From the results obtained for the different aerofoils, the velocity is not constant for all angles of attack, and the results mainly depend on angle of attack and velocity. The vortex generator technique was meticulously studied, with the formulation of the specification for the right-angle-shaped vortex generators (VGs). The results were validated in accordance with the primary analysis phase and were found to be in good agreement with the power curve. The introduction of correctly sized VGs at appropriate locations over the blades of the selected HAWT was found to increase the power generation by about 4%.
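The power relation underlying the coefficient of performance can be sketched briefly. The rotor radius and Cp values below are illustrative assumptions, not data from the NREL study:

```python
import math

def rotor_power(v, radius, cp, rho=1.225):
    """Mechanical power extracted by a HAWT rotor: P = 0.5 * rho * A * v^3 * Cp,
    with A the swept area (m^2), v the wind speed (m/s), rho the air density."""
    area = math.pi * radius ** 2
    return 0.5 * rho * area * v ** 3 * cp  # watts

BETZ_LIMIT = 16 / 27  # theoretical maximum power coefficient, about 0.593

# Illustrative only: a 5 m radius rotor in the 12.5 m/s wind of the study,
# at a typical Cp of 0.40 and at the Betz limit.
p_typical = rotor_power(12.5, 5.0, 0.40)
p_betz = rotor_power(12.5, 5.0, BETZ_LIMIT)
```

The cubic dependence on wind speed is why small aerodynamic gains, such as the roughly 4% from vortex generators reported above, translate into significant annual energy yield.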
Abstract:
This paper describes a new statistical, model-based approach to building a contact state observer. The observer uses measurements of the contact force and position, and prior information about the task encoded in a graph, to determine the current location of the robot in the task configuration space. Each node represents what the measurements will look like in a small region of configuration space by storing a predictive, statistical measurement model. This approach assumes that the measurements are statistically block independent conditioned on knowledge of the model, which is a fairly good model of the actual process. Arcs in the graph represent possible transitions between models. Beam Viterbi search is used to match the measurement history against possible paths through the model graph in order to estimate the most likely path for the robot. The resulting approach provides a new decision process that can be used as an observer for event-driven manipulation programming. The decision procedure is significantly more robust than simple threshold decisions because the measurement history is used to make decisions. The approach can be used to enhance the capabilities of autonomous assembly machines and in quality control applications.
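The beam-pruned Viterbi matching described above can be sketched as follows. This is a generic illustration: the state names, likelihood functions and transition graph passed to it are invented for the example, not taken from the paper:

```python
def beam_viterbi(obs, states, trans, log_lik, log_prior, beam=3):
    """Track the most likely state path through a model graph, keeping only the
    `beam` best partial hypotheses at each step (beam-pruned Viterbi search)."""
    # each hypothesis: state -> (log probability, path so far)
    hyps = {s: (log_prior[s], [s]) for s in states}
    for z in obs:
        scored = {}
        for s, (lp, path) in hyps.items():
            for nxt in trans.get(s, []):          # arcs = allowed model transitions
                cand = lp + log_lik(nxt, z)       # score measurement under model nxt
                if nxt not in scored or cand > scored[nxt][0]:
                    scored[nxt] = (cand, path + [nxt])
        # prune to the `beam` highest-scoring hypotheses
        hyps = dict(sorted(scored.items(), key=lambda kv: -kv[1][0])[:beam])
    return max(hyps.values(), key=lambda v: v[0])[1]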
Abstract:
Based on an online image archive documenting the construction and history of an early computing company, the fictional story of “Co-Operative Explanatory Capabilities in Organizational Design and Personnel Management” follows the development of an experimental approach to worker productivity into a religious cult. The project investigates the place of creativity in efficiency management and the operation of bureaucratic systems in a post-industrial work environment. The project has spawned a series of collages, featured on the Economic Thought Projects 7" collaboration with Gelbart, The Eleventh Voyage, as well as the film of Co-Operative Explanatory Capabilities in Organizational Design and Personnel Management, which has also been published as a short story in Vertigo of the Modern and on Sacrifice Press.
Abstract:
The paper develops a more precise specification and understanding of the process of national-level knowledge accumulation and absorptive capabilities by applying the reasoning and evidence from the firm-level analysis pioneered by Cohen and Levinthal (1989, 1990). In doing so, we acknowledge that significant cross-border effects due to the role of both inward and outward FDI exist and that assimilation of foreign knowledge is not only confined to catching-up economies but is also carried out by countries at the frontier-sharing phase. We postulate a non-linear relationship between national absorptive capacity and the technological gap, due to the effects of the cumulative nature of the learning process and the increase in complexity of external knowledge as the country approaches the technological frontier. We argue that national absorptive capacity and the accumulation of knowledge stock are simultaneously determined. This implies that different phases of technological development require different strategies. During the catching-up phase, knowledge accumulation occurs predominantly through the absorption of trade and/or inward FDI-related R&D spillovers. From the pre-frontier-sharing phase onwards, increases in the knowledge base occur largely through independent knowledge creation and actively accessing foreign-located technological spillovers, inter alia through outward FDI-related R&D, joint ventures and strategic alliances.