888 results for data-driven simulation
Abstract:
The objective of this thesis is to provide a business model framework that connects customer value to firm resources and explains the change logic of the business model. With strategic supply management, and especially dynamic value network management, as its scope, the dissertation is based on basic economic theories: transaction cost economics and the resource-based view. The main research question is how changing customer values should be taken into account when planning business in a networked environment. The main question is divided into sub-questions that form the basic research problems for the separate case studies presented in the five publications. This research adopts the case study strategy, and the constructive research approach within it. The material consists of data from several Delphi panels and expert workshops, software pilot documents, company financial statements and investor relations information on the companies’ web sites. The cases used in this study are a mobile multi-player game value network, smart phone and “Skype mobile” services, the business models of AOL, eBay, Google, Amazon and a telecom operator, a virtual city portal business system and a multi-play offering. The main contribution of this dissertation is bridging the gap between firm resources and customer value. This has been done by theorizing the business model concept and connecting it to both the resource-based view and customer value. This thesis contributes to the resource-based view, which deals with customer value and the firm resources needed to deliver that value, but leaves a gap in explaining how changes in customer value should be connected to changes in key resources. This dissertation also provides tools and processes for analyzing the customer value preferences of ICT services, constructing and analyzing business models and business concept innovation, and conducting resource analysis.
Abstract:
Liver is unique in its capacity to regenerate in response to injury or tissue loss. Hepatocytes and other liver cells are able to proliferate and repopulate the liver. However, when this response is impaired, the contribution of hepatic progenitors becomes very relevant. Here, we present an update of recent studies on growth factor- and cytokine-driven intracellular pathways that govern liver stem/progenitor cell expansion and differentiation, and the relevance of these signals in liver development, regeneration and carcinogenesis. Tyrosine kinase receptor signaling, in particular through c-Met, epidermal growth factor receptors or fibroblast growth factor receptors, contributes to the proliferation, survival and differentiation of liver stem/progenitor cells. Several lines of evidence suggest a dual role for the transforming growth factor (TGF)-β signaling pathway in liver stemness and differentiation. On the one hand, TGF-β mediates progression of differentiation from a progenitor stage, but on the other hand, it contributes to the expansion of liver stem cells. Hedgehog family ligands are necessary to promote hepatoblast proliferation but need to be shut off to permit subsequent hepatoblast differentiation. Along the same lines, the Wnt family and the β-catenin/T-cell factor pathway are clearly involved in the maintenance of the liver stemness phenotype, and their repression is necessary for liver differentiation during development. Collectively, the data indicate that liver stem/progenitor cells follow their own rules and regulations. The same signals that are essential for their activation, expansion and differentiation are good candidates to contribute, under adequate conditions, to the paradigm of transformation from a pro-regenerative to a pro-tumorigenic role. From a clinical perspective, this is a fundamental issue for liver stem/progenitor cell-based therapies.
Abstract:
Connectivity analysis on diffusion MRI data of the whole brain suffers from distortions caused by the standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal destruction that are an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a “theoretically correct” and plausible deformation that resembles the artifact under investigation. We then correct the data back with three standard methodologies (namely fieldmap-based, reversed encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, the dropout compensation, and their impact on the resulting connectivity matrices.
Abstract:
Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvement in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task at hand. Here we present a community-driven curation effort, supported by ELIXIR (the European infrastructure for biological information), that aspires to a comprehensive and consistent registry of information about bioinformatics resources. The sustainable upkeep of this Tools and Data Services Registry is assured by a curation effort driven by and tailored to local needs, and shared amongst a network of engaged partners. As of November 2015, the registry includes 1785 resources, with depositions from 126 individual registrations including 52 institutional providers and 74 individuals. With community support, the registry can become a standard for dissemination of information about bioinformatics resources: we welcome everyone to join us in this common endeavour. The registry is freely available at https://bio.tools.
Abstract:
Background: Virtual reality (VR) simulation is increasingly used in surgical disciplines. Since VR simulators measure multiple outcomes, standardized reporting is needed. Methods: We present an algorithm for combining multiple VR outcomes into dimension summary measures, which are then integrated into a meaningful total score. We reanalyzed the data of two VR studies applying the algorithm. Results: The proposed algorithm was successfully applied to both VR studies. Conclusions: The algorithm contributes to standardized and transparent reporting in VR-related research.
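The abstract does not give the algorithm's details, but the general pattern of normalising heterogeneous simulator outcomes, summarising them per dimension, and integrating a weighted total can be sketched as follows. All metric names, value ranges, directions and weights below are illustrative assumptions, not the authors' actual scheme:

```python
# Sketch: combine multiple VR simulator outcomes into dimension summaries
# and one total score. Every concrete number here is an invented example.

def normalise(value, worst, best):
    """Map a raw outcome onto [0, 1], where 1 represents best performance.
    Works for both directions (e.g. time: lower is better -> worst > best)."""
    score = (value - worst) / (best - worst)
    return max(0.0, min(1.0, score))

def dimension_summary(scores):
    """Summarise one dimension as the mean of its normalised outcomes."""
    return sum(scores) / len(scores)

def total_score(dimensions, weights):
    """Integrate dimension summaries into a weighted total in [0, 1]."""
    return sum(weights[d] * s for d, s in dimensions.items())

# Hypothetical outcomes: task time (lower is better) and two accuracy metrics.
speed = dimension_summary([normalise(140.0, worst=300.0, best=60.0)])
accuracy = dimension_summary([normalise(0.85, worst=0.0, best=1.0),
                              normalise(0.90, worst=0.0, best=1.0)])
total = total_score({"speed": speed, "accuracy": accuracy},
                    {"speed": 0.5, "accuracy": 0.5})
```

Keeping every step on a common [0, 1] scale is what makes the final number interpretable across studies, which is the transparency goal the authors describe.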
Abstract:
Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus, dreaming is considered a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework, and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted on this specific theory put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what constitutes a ‘threatening event’ in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life?
What are the effects of exposure to severe waking life threat on dreams? The results reveal that threatening events are relatively frequent in dreams, and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of the long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism, it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted on the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems to have always resided in the periphery of science, never regarded by the mainstream as worth studying. Nevertheless, when brought to the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge gained from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs: into the spotlight, at the crossroads of various disciplines.
Abstract:
A physical model for the simulation of x-ray emission spectra from samples irradiated with kilovolt electron beams is proposed. Inner shell ionization by electron impact is described by means of total cross sections evaluated from an optical-data model. A double differential cross section is proposed for bremsstrahlung emission, which reproduces the radiative stopping powers derived from the partial wave calculations of Kissel, Quarles and Pratt [At. Data Nucl. Data Tables 28, 381 (1983)]. These ionization and radiative cross sections have been introduced into a general-purpose Monte Carlo code, which performs simulation of coupled electron and photon transport for arbitrary materials. To improve the efficiency of the simulation, interaction forcing, a variance reduction technique, has been applied for both ionizing collisions and radiative events. The reliability of simulated x-ray spectra is analyzed by comparing simulation results with electron probe measurements.
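Interaction forcing, the variance reduction technique mentioned above, can be illustrated with a toy example: a rare event is sampled with an artificially inflated probability, and each scored event carries the compensating statistical weight (the reciprocal of the forcing factor), so the estimator stays unbiased while far more histories contribute. The probabilities and forcing factor below are illustrative, not values from the paper:

```python
import random

# Toy illustration of interaction forcing. A rare per-step event with true
# probability P_TRUE is forced to occur FORCE times more often; each forced
# event is scored with weight 1 / FORCE. Both estimators target the same
# expectation N_STEPS * P_TRUE, but the forced one has far lower variance.
random.seed(1)
P_TRUE = 0.001    # true per-step probability of the rare (radiative) event
FORCE = 50.0      # forcing factor (an assumption for this sketch)
N_STEPS = 200_000

def analogue():
    """Direct (analogue) simulation: score 1 per observed rare event."""
    return sum(1.0 for _ in range(N_STEPS) if random.random() < P_TRUE)

def forced():
    """Forced simulation: inflated probability, compensating weight."""
    weight = 1.0 / FORCE
    return sum(weight for _ in range(N_STEPS)
               if random.random() < P_TRUE * FORCE)
```

The same idea applied to ionizing collisions and bremsstrahlung events is what lets a Monte Carlo code produce smooth x-ray spectra with far fewer electron histories.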
Abstract:
Data traffic generated by mobile advertising client software while communicating with the network server can be a pain point for application developers considering advertising-funded application distribution, since the cost of the data transfer might scare users away from the applications. For the thesis project, a simulation environment was built to mimic the real client-server solution in order to measure data transfer over varying types of connections under different usage scenarios. To optimise data transfer, a few general-purpose compressors and XML-specific compressors were tried for compressing the XML data, and a few protocol optimisations were implemented. To optimise cost, cache usage was improved and pre-loading was enhanced to use free connections for loading the data. The data traffic structure and the various optimisations were analysed, and it was found that cache usage and pre-loading should be enhanced and that the protocol should be changed, with report aggregation and compression using WBXML or gzip.
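The kind of gain the thesis measures from compressing XML traffic is easy to reproduce in miniature with gzip from the Python standard library. The ad-report payload below is invented for illustration; WBXML tokenisation is not shown, since it requires a format-specific codepage definition:

```python
import gzip

# Build a small, repetitive XML payload resembling aggregated ad reports.
# The element and attribute names are made up for this sketch.
reports = "".join(
    f'<report ad="banner-{i % 4}" shown="1" clicked="0"/>' for i in range(200)
)
payload = f"<adreports>{reports}</adreports>".encode("utf-8")

compressed = gzip.compress(payload)
ratio = len(compressed) / len(payload)
# Highly repetitive XML markup compresses very well, which is why aggregating
# many reports into one payload before compressing pays off over the air.
```

This also shows why report aggregation and compression belong together: many tiny payloads compressed separately cannot exploit the redundancy that one aggregated payload exposes.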
Abstract:
Self-sustained time-dependent current oscillations under dc voltage bias have been observed in recent experiments on n-doped semiconductor superlattices with sequential resonant tunneling. The current oscillations are caused by the motion and recycling of the domain wall separating low- and high-electric-field regions of the superlattice, as the analysis of a discrete drift model shows and experimental evidence supports. Numerical simulation shows that different nonlinear dynamical regimes of the domain wall appear when an external microwave signal is superimposed on the dc bias and its driving frequency and driving amplitude vary. On the frequency-amplitude parameter plane, there are regions of entrainment and quasiperiodicity forming Arnold tongues. Chaos is demonstrated to appear at the boundaries of the tongues and in the regions where they overlap. Coexistence of up to four electric-field domains randomly nucleated in space is detected under ac+dc driving.
Abstract:
This thesis introduces a real-time simulation environment based on the multibody simulation approach. The environment consists of components that are used in conventional product development, including computer-aided drawing, visualization, dynamic simulation and finite element software architecture, data transfer and haptics. These components are combined to perform as a coupled system on one platform. The environment is used to simulate mobile and industrial machines at different stages of the product lifetime. Consequently, the demands of the simulated scenarios vary. In this thesis, a real-time simulation environment based on the multibody approach is used to study a reel mechanism of a paper machine and a gantry crane. These case systems are used to demonstrate the usability of the real-time simulation environment for fault detection purposes and in the context of a training simulator. In order to describe the dynamic performance of a mobile or industrial machine, the nonlinear equations of motion must be defined. In this thesis, the dynamic behaviour of machines is modelled using the multibody simulation approach. A multibody system may consist of rigid and flexible bodies which are joined using kinematic joint constraints, while force components are used to describe the actuators. The strength of multibody dynamics lies in its ability to describe, in a systematic manner, nonlinearities arising from the wearing of components, friction, large rotations or contact forces. For this reason, the interfaces between subsystems such as the mechanics, hydraulics and control systems of a mechatronic machine can be defined and analyzed in a straightforward manner.
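The constrained equations of motion referred to above are, in the standard textbook formulation of multibody dynamics (shown here as a general sketch; the thesis' exact formulation is not reproduced in the abstract), a differential-algebraic system:

```latex
% Constrained multibody equations of motion in DAE form:
%   q        generalised coordinates      M(q)      mass matrix
%   \Phi     kinematic constraints        \lambda   Lagrange multipliers
%   Q        applied, actuator and velocity-dependent force terms
\begin{aligned}
  M(q)\,\ddot{q} + \Phi_q^{\mathsf{T}}(q,t)\,\lambda &= Q(q,\dot{q},t),\\
  \Phi(q,t) &= 0.
\end{aligned}
```

The constraint equations \(\Phi = 0\) encode the kinematic joints, while actuators such as hydraulics enter through \(Q\), which is what makes the subsystem interfaces mentioned above explicit.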
Abstract:
The purpose of this thesis was to investigate creating and improving category purchasing visibility for corporate procurement by utilizing financial information. This thesis was a part of the global category-driven spend analysis project of Konecranes Plc. While building a general understanding of category-driven corporate spend visibility, the IT architecture and the purchasing parameters needed for spend analysis were described. In the case part of the study, three manufacturing plants of the Konecranes Standard Lifting, Heavy Lifting and Services business areas were examined. This included investigating the operative IT system architecture and the processes needed for building corporate spend visibility. The key finding of this study was the identification of the processes needed for gathering purchasing data elements while creating corporate spend visibility in a fragmented source system environment. As an outcome of the study, a roadmap presenting further development areas was introduced for Konecranes.
Abstract:
Fusarium Head Blight (FHB) is a disease of great concern in wheat (Triticum aestivum). Due to its relatively narrow susceptible phase and environmental dependence, the pathosystem is suitable for modeling. In the present work, a mechanistic model for estimating an infection index of FHB was developed. The model is process-based, driven by rates, rules and coefficients for estimating the dynamics of flowering, airborne inoculum density and infection frequency. The latter is a function of temperature during an infection event (IE), which is defined based on a combination of daily records of precipitation and mean relative humidity. The daily infection index is the product of the daily proportion of susceptible tissue available, the infection frequency and the spore cloud density. The model was evaluated with an independent dataset of epidemics recorded in experimental plots (five years and three planting dates) at Passo Fundo, Brazil. Four models that use different factors were tested, and the results showed that all were able to explain the variation in disease incidence and severity. A model that uses a correction factor for extending host susceptibility and daily spore cloud density to account for post-flowering infections was the most accurate, explaining 93% of the variation in disease severity and 69% of the variation in disease incidence according to regression analysis.
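The product structure of the daily infection index described above can be sketched directly. The multiplication of the three factors follows the abstract; the temperature response curve, its parameters, and the example values are illustrative assumptions, not the fitted functions from the paper:

```python
# Sketch of the daily FHB infection index: the product of the daily proportion
# of susceptible (flowering) tissue, the temperature-driven infection
# frequency during an infection event, and the airborne spore cloud density.
# The functional form and all numbers below are invented for illustration.

def infection_frequency(temp_c):
    """Hypothetical unimodal temperature response, peaking near 25 degrees C."""
    if not 10.0 <= temp_c <= 35.0:
        return 0.0
    return max(0.0, 1.0 - ((temp_c - 25.0) / 10.0) ** 2)

def daily_infection_index(susceptible_fraction, temp_c, spore_density,
                          is_infection_event):
    """Daily index; nonzero only on days qualifying as infection events,
    which the model defines from precipitation and mean relative humidity."""
    if not is_infection_event:
        return 0.0
    return susceptible_fraction * infection_frequency(temp_c) * spore_density

# Example day: 60% susceptible tissue, 22 degrees C, moderate spore cloud.
index = daily_infection_index(0.6, 22.0, 0.8, True)
```

Because the index is a product, any single factor being zero (no flowering tissue, unfavourable temperature, or no infection event) suppresses the day's contribution entirely, which matches the narrow-window behaviour the abstract emphasises.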
Abstract:
The objective of this master’s thesis was to examine technology-based smart home devices and services. The topic was approached through basic theories, transaction cost theory and the resource-based view, in order to build a basis for the thesis. The conceptual framework was discussed by means of networks, value networks and service systems, which provide a useful framework for service development. The needs of the elderly living at home were discussed in order to find out which technology-based services could be used to satisfy those needs. Segmentation and need data collected previously during proactive home visits were exploited, and additionally a survey targeted at experts and professionals of the social and health care sector was conducted to verify the needs. Finally, the results of the survey were analyzed using the quality function deployment method to identify the most important and suitable service offerings for the elderly. The analysis concluded that social media and monitoring services are the most useful technology-based services. However, traditional home services will still maintain their necessity too.
Abstract:
BACKGROUND: Simulation techniques are spreading rapidly in medicine. Such resources are increasingly concentrated in Simulation Laboratories. The MSRP-USP is structuring such a laboratory and is interested in the prevalence of individual initiatives that could be centralized there. The MSRP-USP currently has five full-curriculum courses in the health sciences: Medicine, Speech Therapy, Physical Therapy, Nutrition, and Occupational Therapy, all consisting of core disciplines. GOAL: To determine the prevalence of simulation techniques in the regular courses at MSRP-USP. METHODS: Coordinators of disciplines in the various courses were interviewed using a specifically designed semi-structured questionnaire, and all the collected data were stored in a dedicated database. The disciplines were grouped according to whether they used (GI) or did not use (GII) simulation resources. RESULTS AND DISCUSSION: 256 disciplines were analyzed, of which only 18.3% used simulation techniques, varying according to course: Medicine (24.7%), Occupational Therapy (23.0%), Nutrition (15.9%), Physical Therapy (9.8%), and Speech Therapy (9.1%). Computer simulation programs predominated (42.5%) in all five courses. The resources were provided mainly by MSRP-USP (56.3%), with additional funding coming from other sources based on individual initiatives. The same pattern was observed for maintenance. There was great interest in centralizing the resources in the new Simulation Laboratory in order to facilitate maintenance, but there was concern about training and access to the material. CONCLUSIONS: 1) The MSRP-USP simulation resources show low complexity and are mainly limited to computer programs; 2) Use of simulation varies according to course, and is most prevalent in Medicine; 3) Resources are scattered across several locations, and their acquisition and maintenance depend on individual initiatives rather than central coordination or curricular guidelines.
Abstract:
The concept of open innovation has recently gained widespread attention, and it is particularly relevant now, as many firms endeavouring to implement open innovation face different sets of challenges associated with managing it. Prior research on open innovation has focused on the internal processes of open innovation implementation and on the organizational changes, already taking place or yet required in companies, in order to succeed in the global open innovation market. Despite the intensive research on open innovation, the question of what influences its adoption by companies in different contexts has not received much attention. To fill this gap, this thesis contributes to the discussion on the factors influencing open innovation by bringing in the perspective of environmental impacts, i.e. gathering data on possible sources of external influence, classifying them and testing their systemic impact through a conceptual system dynamics simulation model. The insights from data collection and conceptualization in modelling are used to answer the question of how the external environment affects the adoption of open innovation. The thesis research is presented through five research papers reflecting a method-triangulation-based study (conducted initially as a case study, later as a quantitative analysis and finally as a system dynamics simulation). This multitude of methods was used to collect the possible external influence factors and to assess their impact (on a positive/negative scale rather than numerically). The results obtained throughout the thesis research bring valuable insights into the understanding of the factors influencing open innovation inside a firm’s operating environment, point out the balance required in the system for successful open innovation performance, and reveal the existence of a tipping point of open innovation success when driven by market dynamics and structures.
Practical implications on how firms and policy-makers can leverage the environment for their potential benefit are offered in the conclusions.
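The tipping-point behaviour this kind of system dynamics model exhibits can be illustrated with a minimal stock-and-flow sketch: an adoption stock with one reinforcing inflow and one outflow, whose balance creates an unstable equilibrium separating collapse from sustained adoption. The equations and parameters are illustrative assumptions, not the thesis' actual model:

```python
# Toy system dynamics sketch of a tipping point in adoption. The adopting
# fraction a grows through a reinforcing feedback (~ a^2 * (1 - a)) and
# shrinks through abandonment (~ a). All parameters are invented.

def simulate(a0, gain=1.0, decay=0.2, steps=2000, dt=0.05):
    """Euler-integrate da/dt = gain*a^2*(1 - a) - decay*a from a0."""
    a = a0
    for _ in range(steps):
        a += dt * (gain * a * a * (1.0 - a) - decay * a)
    return a

# With these parameters the unstable equilibrium (the tipping point) sits at
# a = (1 - 0.2**0.5) / 2, about 0.28: seeds below it die out, seeds above it
# settle at the high-adoption equilibrium (1 + 0.2**0.5) / 2, about 0.72.
collapsed = simulate(0.20)
sustained = simulate(0.35)
```

The qualitative point mirrors the thesis' finding: whether open innovation takes hold depends not only on a firm's own effort but on whether the surrounding dynamics push the system past the threshold.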