906 results for Model driven architecture (MDA) initiative
Abstract:
Spiral chemical waves subjected to a spatiotemporal random excitability are experimentally and numerically investigated in relation to the light-sensitive Belousov-Zhabotinsky reaction. Brownian motion is identified and characterized by an effective diffusion coefficient which shows a rather complex dependence on the time and length scales of the noise relative to those of the spiral. A kinematically based model is proposed whose results are in good qualitative agreement with experiments and numerics.
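As an illustration of how an effective diffusion coefficient can be extracted from a wandering spiral-tip trajectory, the following sketch fits the mean squared displacement of a two-dimensional random walk. The trajectory is synthetic and all parameters are illustrative; nothing here is taken from the experiments or simulations of the paper.

```python
# Minimal sketch: estimating an effective diffusion coefficient from a 2D
# trajectory via the mean squared displacement (MSD). The noisy tip trajectory
# below is synthetic; in the experiments the tip position would come from
# image analysis of the light-sensitive Belousov-Zhabotinsky medium.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Brownian-like tip trajectory (hypothetical stand-in for data)
n_steps, dt, d_true = 5000, 0.1, 0.05
steps = rng.normal(scale=np.sqrt(2 * d_true * dt), size=(n_steps, 2))
xy = np.cumsum(steps, axis=0)

def msd(traj, max_lag):
    """Time-averaged mean squared displacement for lags 1..max_lag."""
    lags = np.arange(1, max_lag + 1)
    out = np.empty(len(lags))
    for i, lag in enumerate(lags):
        d = traj[lag:] - traj[:-lag]
        out[i] = np.mean(np.sum(d**2, axis=1))
    return lags, out

lags, m = msd(xy, max_lag=200)
# For 2D Brownian motion MSD(t) = 4 * D_eff * t, so D_eff is the slope / 4.
slope = np.polyfit(lags * dt, m, 1)[0]
print(f"estimated D_eff = {slope / 4:.4f} (true {d_true})")
```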
Abstract:
Critical exponents of the infinitely slowly driven Zhang model of self-organized criticality are computed for d=2 and 3, with particular emphasis devoted to the various roughening exponents. Besides confirming recent estimates of some exponents, new quantities are monitored, and their critical exponents computed. Among other results, it is shown that the three-dimensional exponents do not coincide with those of the Bak-Tang-Wiesenfeld [Phys. Rev. Lett. 59, 381 (1987); Phys. Rev. A 38, 364 (1988)] (Abelian) model, and that the dynamical exponents computed from the correlation length and from the roughness of the energy profile do not necessarily coincide, as is usually implicitly assumed. An explanation for this is provided. The possibility of comparing these results with those obtained from renormalization group arguments is also briefly addressed.
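For readers unfamiliar with the model, the sketch below implements the basic toppling rule of a slowly driven Zhang sandpile in d = 2 (continuous energies, unit threshold, equal redistribution to nearest neighbours, open boundaries). The lattice size, driving amplitude and number of avalanches are illustrative choices; no attempt is made to reproduce the exponents reported in the abstract.

```python
# Minimal sketch of the slowly driven Zhang sandpile in d = 2.
# Assumptions: open boundaries, unit threshold, uniform random energy input.
import numpy as np

rng = np.random.default_rng(1)
L, E_C, N_AVALANCHES = 32, 1.0, 2000
energy = np.zeros((L, L))
sizes = []

for _ in range(N_AVALANCHES):
    # Slow driving: energy is added to a random site only when the system is stable.
    i, j = rng.integers(0, L, size=2)
    energy[i, j] += rng.uniform(0.0, 0.25)
    size = 0
    while True:
        over = np.argwhere(energy > E_C)
        if len(over) == 0:
            break
        for x, y in over:
            # Zhang rule: the unstable site resets to zero and shares its energy equally.
            share = energy[x, y] / 4.0
            energy[x, y] = 0.0
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < L and 0 <= ny < L:   # energy leaves at open boundaries
                    energy[nx, ny] += share
            size += 1
    sizes.append(size)

print("mean avalanche size:", np.mean(sizes))
```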
Abstract:
This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as very large problems.
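The following sketch illustrates the general idea of a clustering function trained by stochastic gradient descent, which lets out-of-sample points be assigned without re-running the clustering. It uses a plain mini-batch k-means surrogate on toy Gaussian data rather than the neural-network model of the thesis; the data, parameters and function names are illustrative assumptions.

```python
# Minimal sketch: a clustering function optimized by mini-batch SGD, so that
# new samples can be assigned by evaluating the learned function (out-of-sample).
# This is a simple k-means-style surrogate, not the thesis' actual model.
import numpy as np

rng = np.random.default_rng(2)

# Toy data: three Gaussian blobs standing in for spectral pixels / case data.
X = np.vstack([rng.normal(m, 0.3, size=(500, 2)) for m in ((0, 0), (3, 0), (0, 3))])
K, lr, batch, epochs = 3, 0.05, 32, 20
centroids = X[rng.choice(len(X), K, replace=False)].copy()

for _ in range(epochs):
    for idx in np.array_split(rng.permutation(len(X)), len(X) // batch):
        xb = X[idx]
        # Hard assignment to the nearest centroid, then an SGD step on the
        # quantization error of each assigned centroid.
        d = ((xb[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        a = d.argmin(1)
        for k in range(K):
            if np.any(a == k):
                grad = -2 * (xb[a == k] - centroids[k]).mean(0)
                centroids[k] -= lr * grad

def assign(x_new):
    """Out-of-sample assignment: evaluate the learned clustering function."""
    return ((x_new[:, None, :] - centroids[None, :, :]) ** 2).sum(-1).argmin(1)

print(assign(np.array([[0.1, 0.2], [2.9, 0.1], [0.2, 3.1]])))
```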
Abstract:
Enterprise Architecture (EA), which has been approached by both academia and industry, is considered to comprise not only architectural representations, but also the principles guiding the architecture's design and evolution. Even though EA principles have been defined as an integral part of EA, the number of publications on the subject is very limited and only a few organizations use EA principles to manage their EA endeavors. In order to critically assess the current state of research and identify research gaps in EA principles, we focus on four general aspects of theoretical contributions in IS. By applying these aspects to EA principles, we outline future research directions regarding the nature, adoption, practices, and impact of EA principles.
Abstract:
PURPOSE: Abdominal aortic aneurysms (AAAs) expand because of aortic wall destruction. Enrichment in vascular smooth muscle cells (VSMCs) stabilizes expanding AAAs in rats. Mesenchymal stem cells (MSCs) can differentiate into VSMCs. We tested the hypothesis that bone marrow-derived MSCs (BM-MSCs) stabilize AAAs in a rat model. MATERIAL AND METHODS: Rat Fischer 344 BM-MSCs were isolated by plastic adhesion and seeded endovascularly in experimental AAAs created using a xenograft obtained from guinea pig. Culture medium without cells was used as the control. The main criterion was the variation of the aortic diameter at one week and four weeks. We evaluated the impact of cell seeding on the inflammatory response by immunohistochemistry combined with RT-PCR for MMP-9 and TIMP-1 at one week, and the healing process by immunohistochemistry at four weeks. RESULTS: Endovascular seeding of BM-MSCs decreased AAA diameter expansion more effectively than VSMC or culture medium infusion (6.5% ± 9.7, 25.5% ± 17.2 and 53.4% ± 14.4, respectively; p = .007). This result was sustained at four weeks. Compared with culture medium infusion, BM-MSCs decreased MMP-9 expression and macrophage infiltration (4.7 ± 2.3 vs. 14.6 ± 6.4 per mm², respectively; p = .015) and increased tissue inhibitor of metalloproteinase-1 (TIMP-1). BM-MSCs induced the formation of a neo-aortic tissue rich in SM-alpha actin-positive cells (22.2 ± 2.7 vs. 115.6 ± 30.4 cells per surface unit, p = .007) surrounded by a dense collagen and elastin network covered by luminal endothelial cells. CONCLUSIONS: We have shown in this rat model of AAA that BM-MSCs exert a specialized function in arterial regeneration that transcends that of mature mesenchymal cells. Our observations identify a population of cells that is easy to isolate and expand for therapeutic interventions based on catheter-driven cell therapy.
Abstract:
There is a lack of dedicated tools for business model design at a strategic level. Yet in today's economic environment, the ability to quickly reinvent a company's business model is essential to stay competitive. This research focused on identifying the functionalities that are necessary in a computer-aided design (CAD) tool for the design of business models in a strategic context. Using the design science research methodology, a series of techniques and prototypes were designed and evaluated to offer solutions to the problem. The work is a collection of articles which can be grouped into three parts. The first part establishes the context of how the Business Model Canvas (BMC) is used to design business models and explores the ways in which CAD can contribute to the design activity. The second part extends on this by proposing new techniques and tools which support elicitation, evaluation (assessment) and evolution of business model designs with CAD. This includes features such as multi-color tagging to easily connect elements, rules to validate the coherence of business models, and features adapted to the business model proficiency level of the users. A new way to describe and visualize multiple versions of a business model, and thereby help address the business model as a dynamic object, was also researched. The third part explores extensions to the Business Model Canvas, such as an intermediary model which supports IT alignment by connecting the business model and the enterprise architecture, and a business model pattern for privacy in a mobile environment which uses privacy as a key value proposition. The prototyped techniques and the propositions for using CAD tools in business model design will allow commercial CAD developers to create tools that are better suited to the needs of practitioners.
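As an illustration of what a machine-checkable coherence rule over a Business Model Canvas could look like in a CAD tool, the sketch below validates that every revenue stream is linked to a known customer segment. The canvas structure and the rule itself are assumptions made for illustration; they are not taken from the dissertation's actual rule set.

```python
# Minimal sketch of a coherence rule a BMC CAD tool might evaluate. The rule
# ("every revenue stream must be linked to at least one known customer segment")
# and the data structure are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Canvas:
    customer_segments: set[str] = field(default_factory=set)
    revenue_streams: dict[str, set[str]] = field(default_factory=dict)  # stream -> linked segments

def check_coherence(canvas: Canvas) -> list[str]:
    """Return human-readable violations of the example coherence rule."""
    issues = []
    for stream, segments in canvas.revenue_streams.items():
        unknown = segments - canvas.customer_segments
        if not segments:
            issues.append(f"revenue stream '{stream}' is not linked to any customer segment")
        if unknown:
            issues.append(f"revenue stream '{stream}' references unknown segments: {sorted(unknown)}")
    return issues

bmc = Canvas(customer_segments={"SMEs"},
             revenue_streams={"subscription": {"SMEs"}, "consulting": set()})
print(check_coherence(bmc))
```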
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is a separate industry from the RISC chip manufacturing industry. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies, both in software and in hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
For patients with chronic lung diseases, such as chronic obstructive pulmonary disease (COPD), exacerbations are life-threatening events causing acute respiratory distress that can even lead to hospitalization and death. Although a great deal of effort has been put into research on exacerbations and potential treatment options, the exact underlying mechanisms are yet to be deciphered and no therapy that effectively targets the excessive inflammation is available. In this study, we report that interleukin-1β (IL-1β) and interleukin-17A (IL-17A) are key mediators of neutrophilic inflammation in influenza-induced exacerbations of chronic lung inflammation. Using a mouse model of disease, our data show a role for IL-1β in mediating lung dysfunction and in driving neutrophilic inflammation throughout the course of viral infection. We further report a role for IL-17A as a mediator of IL-1β-induced neutrophilia at early time points during influenza-induced exacerbations. Blocking IL-17A or IL-1 resulted in a significant abrogation of neutrophil recruitment to the airways in the initial phase of infection or at the peak of viral replication, respectively. Therefore, IL-17A and IL-1β are potential targets for the therapeutic treatment of viral exacerbations of chronic lung inflammation.
Abstract:
The continuous production of vascular tissues through secondary growth results in radial thickening of plant organs and is pivotal for various aspects of plant growth and physiology, such as water transport capacity or resistance to mechanical stress. It is driven by the vascular cambium, which produces inward secondary xylem and outward secondary phloem. In the herbaceous plant Arabidopsis thaliana (Arabidopsis), secondary growth occurs in stems, in roots and in the hypocotyl. In the latter, radial growth is most prominent and not obscured by parallel ongoing elongation growth. Moreover, its progression is reminiscent of the secondary growth mode of tree trunks. Thus, the Arabidopsis hypocotyl is a very good model to study basic molecular mechanisms of secondary growth. Genetic approaches have succeeded in the identification of various factors, including peptides, receptors, transcription factors and hormones, which appear to participate in a complex network that controls radial growth. Many of these players are conserved between herbaceous and woody plants. In this review, we will focus on what is known about molecular mechanisms and regulators of vascular secondary growth in the Arabidopsis hypocotyl.
Abstract:
Recent single-cell studies in monkeys (Romo et al., 2004) show that the activity of neurons in the ventral premotor cortex covaries with the animal's decisions in a perceptual comparison task regarding the frequency of vibrotactile events. The firing rate response of these neurons was dependent only on the frequency differences between the two applied vibrations, the sign of that difference being the determining factor for correct task performance. We present a biophysically realistic neurodynamical model that can account for the most relevant characteristics of this decision-making-related neural activity. One of the nontrivial predictions of this model is that Weber's law will underlie the perceptual discrimination behavior. We confirmed this prediction in behavioral tests of vibrotactile discrimination in humans and propose a computational explanation of perceptual discrimination that accounts naturally for the emergence of Weber's law. We conclude that the neurodynamical mechanisms and computational principles underlying the decision-making processes in this perceptual discrimination task are consistent with a fluctuation-driven scenario in a multistable regime.
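The emergence of Weber's law can be illustrated with a toy signal-detection simulation: if the internal representation of each vibration frequency carries noise proportional to that frequency, the probability of a correct comparison depends on the relative difference (f2 - f1)/f1 rather than on the absolute difference. The sketch below is not the biophysical attractor-network model of the paper; the 10% noise level and the stimulus frequencies are illustrative assumptions.

```python
# Minimal sketch: Weber's law from multiplicative internal noise in a
# two-frequency comparison task. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
weber_fraction_noise = 0.1   # internal noise std = 10% of the stimulus value
trials = 20000

def p_correct(f1, f2):
    """Probability of correctly judging which of two frequencies is higher."""
    r1 = f1 * (1 + weber_fraction_noise * rng.normal(size=trials))
    r2 = f2 * (1 + weber_fraction_noise * rng.normal(size=trials))
    return np.mean((r2 > r1) == (f2 > f1))

for f1 in (10.0, 20.0, 40.0):
    same_ratio = p_correct(f1, 1.2 * f1)     # constant relative difference
    same_delta = p_correct(f1, f1 + 2.0)     # constant absolute difference
    print(f"f1={f1:5.1f} Hz  P(correct | df/f=0.2)={same_ratio:.3f}  "
          f"P(correct | df=2 Hz)={same_delta:.3f}")
```

At a fixed ratio the performance stays roughly constant across base frequencies, while at a fixed absolute difference it degrades as the base frequency grows, which is the signature of Weber's law.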
Abstract:
We characterize the different morphological phases that occur in a simple one-dimensional model of propagation of innovations among economic agents [X. Guardiola et al., Phys. Rev. E 66, 026121 (2002)]. We show that the model can be regarded as a nonequilibrium surface growth model. This allows us to demonstrate the presence of a continuous roughening transition between a flat phase (system-size-independent fluctuations) and a rough phase (system-size-dependent fluctuations). Finite-size scaling studies at the transition strongly suggest that the dynamic critical transition does not belong to the directed percolation class and, in fact, the critical exponents do not seem to fit any of the known universality classes of nonequilibrium phase transitions. Finally, we present an explanation for the occurrence of the roughening transition and argue that avalanche-driven dynamics is responsible for the novel critical behavior.
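For reference, the sketch below computes the two ingredients of the roughening analysis mentioned in the abstract: the interface width W(L, t) of a one-dimensional height profile and its finite-size scaling W_sat(L) ~ L^alpha. It uses random deposition with surface relaxation (the Family model) purely as a stand-in growth rule with a known exponent (alpha = 1/2 in 1D), not the innovation-propagation model itself; lattice sizes and run lengths are illustrative.

```python
# Minimal sketch: interface width and finite-size scaling for a 1D growth model.
# The growth rule (random deposition with relaxation) is a stand-in, chosen
# because its saturated width scales as L^(1/2).
import numpy as np

rng = np.random.default_rng(4)

def width(h):
    """Interface roughness: rms fluctuation of the height profile around its mean."""
    return np.sqrt(np.mean((h - h.mean()) ** 2))

def saturated_width(L, monolayers):
    h = np.zeros(L, dtype=int)
    w = []
    for t in range(monolayers * L):          # one monolayer = L deposition attempts
        i = rng.integers(L)
        left, right = (i - 1) % L, (i + 1) % L
        # deposit on the lowest of the site and its two neighbours (relaxation)
        j = min((h[left], left), (h[i], i), (h[right], right))[1]
        h[j] += 1
        if t % L == 0:
            w.append(width(h))
    return np.mean(w[len(w) // 2:])          # average over the saturated regime

sizes = np.array([8, 16, 32])
ws = np.array([saturated_width(L, monolayers=200 * L) for L in sizes])
alpha = np.polyfit(np.log(sizes), np.log(ws), 1)[0]
print("saturated widths:", np.round(ws, 3), " estimated alpha ~", round(alpha, 2))
```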
Abstract:
As part of a European initiative (EuroVacc), we report the design, construction, and immunogenicity of two HIV-1 vaccine candidates based on a clade C virus strain (CN54) representing the current major epidemic in Asia and parts of Africa. Open reading frames encoding an artificial 160-kDa GagPolNef (GPN) polyprotein and the external glycoprotein gp120 were fully RNA and codon optimized. A DNA vaccine (DNA-GPN and DNA-gp120, referred to as DNA-C) and a replication-deficient vaccinia virus encoding both reading frames (NYVAC-C) were assessed for immunogenicity in BALB/c mice. The intramuscular administration of both plasmid DNA constructs, followed by two booster DNA immunizations, induced substantial T-cell responses against both antigens as well as Env-specific antibodies. Whereas low doses of NYVAC-C failed to induce specific CTL or antibodies, high doses generated cellular as well as humoral immune responses, but these did not reach the levels seen following DNA vaccination. The most potent immune responses were detectable using prime-boost protocols, regardless of whether DNA-C or NYVAC-C was used as the priming or boosting agent. These preclinical findings reveal the immunogenic response triggered by DNA-C and its enhancement by combining it with NYVAC-C, thus complementing the macaque preclinical and human phase I clinical studies of EuroVacc.
Abstract:
The failure of current strategies to provide an explanation for controversial findings on the pattern of pathophysiological changes in Alzheimer's disease (AD) motivates the development of new integrative approaches based on multi-modal neuroimaging data that capture various aspects of disease pathology. Previous studies using [18F]fluorodeoxyglucose positron emission tomography (FDG-PET) and structural magnetic resonance imaging (sMRI) report controversial results about the time line, spatial extent and magnitude of glucose hypometabolism and atrophy in AD, depending on the clinical and demographic characteristics of the studied populations. Here, we provide and validate, at the group level, a generative anatomical model of glucose hypometabolism and atrophy progression in AD, based on FDG-PET and sMRI data of 80 patients and 79 healthy controls, which describes the expected age- and symptom-severity-related changes in AD relative to a baseline provided by healthy aging. We demonstrate a high level of anatomical accuracy for both modalities, yielding strongly age- and symptom-severity-dependent glucose hypometabolism in temporal, parietal and precuneal regions and a more extensive network of atrophy in hippocampal, temporal, parietal, occipital and posterior caudate regions. The model suggests greater and more consistent changes in FDG-PET compared to sMRI at earlier stages, and an inversion of this pattern at more advanced AD stages. Our model describes, integrates and predicts characteristic patterns of AD-related pathology, uncontaminated by normal age effects, derived from multi-modal data. It further provides an integrative explanation for findings suggesting a dissociation between early- and late-onset AD. The generative model offers a basis for the further development of individualized biomarkers allowing accurate early diagnosis and treatment evaluation.
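A much simplified sketch of the underlying statistical idea, modelling a regional imaging measure as a healthy-aging baseline plus a symptom-severity-related deviation, is given below. The data are synthetic and the linear form, sample sizes and coefficients are illustrative assumptions, not the generative model actually used in the study.

```python
# Minimal sketch: separating a normal-aging trend from a severity-related
# disease effect in a regional imaging measure (FDG uptake or gray-matter
# volume). Synthetic data; severity = 0 stands in for healthy controls.
import numpy as np

rng = np.random.default_rng(5)
n = 160
age = rng.uniform(55, 85, n)
severity = np.where(rng.random(n) < 0.5, 0.0, rng.uniform(5, 25, n))

# Synthetic regional measure: declines mildly with age, strongly with severity.
y = 100 - 0.3 * (age - 55) - 0.8 * severity + rng.normal(0, 2, n)

# Linear model: baseline + age slope (healthy aging) + severity slope (disease).
X = np.column_stack([np.ones(n), age - 55, severity])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"age slope per year: {beta[1]:.2f}, severity slope per point: {beta[2]:.2f}")
```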
Abstract:
The penetration of PKI technology in the market has been slow because of interoperability concerns. The main causes are not technical but political and social, since there is no trust development model that appropriately deals with multidomain PKIs. We propose a new architecture that, on the one hand, considers that trust is not a homogeneous property but is tied to a particular relation and, on the other hand, holds that trust management must be performed by specialized entities that can evaluate its risks and threats. The model is based on trust certificate lists, which allow users to hold a personalized trust view without having to get involved in technical details. The model dynamically adapts to context changes thanks to a new certificate extension, which we have called TrustProviderLink (TPL).
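A minimal sketch of the data structure suggested by the abstract, a trust certificate list in which trust is evaluated per relation rather than as a global property of a certification authority, is shown below. The class names, fields and resolution rule are illustrative assumptions; the TrustProviderLink (TPL) certificate extension itself is not modelled here.

```python
# Minimal sketch: a per-relation trust certificate list published by a
# specialized trust-management entity. Names and fields are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class TrustEntry:
    ca_name: str
    relation: str          # the particular relation this trust applies to
    risk_level: str        # the trust provider's own risk assessment

class TrustCertificateList:
    def __init__(self, provider: str, entries: list[TrustEntry]):
        self.provider = provider
        self._index = {(e.ca_name, e.relation): e for e in entries}

    def is_trusted(self, ca_name: str, relation: str) -> bool:
        """Trust is resolved per relation, not as a global property of the CA."""
        return (ca_name, relation) in self._index

tcl = TrustCertificateList("ExampleTrustProvider", [
    TrustEntry("Bank-CA", "e-banking", "low"),
    TrustEntry("Hospital-CA", "e-health", "medium"),
])
print(tcl.is_trusted("Bank-CA", "e-banking"), tcl.is_trusted("Bank-CA", "e-health"))
```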
Abstract:
Ensuring the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so the specialized domain agents are independent of the negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the exchange of information between agents.
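The sketch below illustrates the kind of message flow implied by the architecture: a monitoring system agent asks the database broker agent for an item of the patient's clinical history and checks a protocol constraint against it. The agent classes, the query format and the example dose rule are illustrative assumptions, not the paper's specification language or actual agent interfaces.

```python
# Minimal sketch: a system (monitoring) agent querying the database broker
# agent and checking one protocol constraint. All names and the example rule
# are illustrative.
from dataclasses import dataclass

@dataclass
class Query:
    patient_id: str
    item: str

class DatabaseBrokerAgent:
    """Domain agent responsible for fetching clinical-history data."""
    def __init__(self, records):
        self._records = records
    def handle(self, q: Query):
        return self._records.get(q.patient_id, {}).get(q.item)

class MonitoringAgent:
    """System agent that supervises one protocol constraint."""
    def __init__(self, broker: DatabaseBrokerAgent):
        self.broker = broker
    def check_max_dose(self, patient_id: str, drug: str, max_daily_dose: float) -> bool:
        given = self.broker.handle(Query(patient_id, f"daily_dose:{drug}"))
        return given is not None and given <= max_daily_dose

broker = DatabaseBrokerAgent({"p-001": {"daily_dose:heparin": 12500.0}})
monitor = MonitoringAgent(broker)
print("protocol respected:", monitor.check_max_dose("p-001", "heparin", 15000.0))
```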