976 results for Distributed resources
Abstract:
Open source is typically acquired outside of normal commercial software procurement processes. The challenges: an increasingly diverse and distributed set of development resources, and little or no visibility into the origins of the software. A supply chain comparison of hardware and software shows that open source has revolutionized the mobile and device landscape, that other industries will follow, and that supply chain management techniques from hardware are useful for managing software. SPDX is a standard format for communicating a software Bill of Materials across the supply chain. Effective management and control require training, tools, processes, and standards.
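As an illustration of the SPDX format mentioned above, the following is a minimal, hedged sketch of an SPDX 2.2 tag-value Bill of Materials for a single package; the document name, package details, and license are illustrative placeholders, not taken from the source.

```python
# Minimal sketch of an SPDX 2.2 tag-value SBOM for one package.
# All package details below are hypothetical placeholders.
SPDX_DOC = """\
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-sbom
Creator: Tool: sbom-sketch
Created: 2020-01-01T00:00:00Z
PackageName: example-lib
SPDXID: SPDXRef-Package-example-lib
PackageVersion: 1.4.2
PackageDownloadLocation: https://example.org/example-lib-1.4.2.tar.gz
PackageLicenseConcluded: Apache-2.0
"""

def write_sbom(path: str = "example.spdx") -> None:
    """Write the SBOM to a file so it can travel with the software it describes."""
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(SPDX_DOC)

if __name__ == "__main__":
    write_sbom()
```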
Abstract:
An Unmanned Aerial Vehicle (UAV) is a non-piloted airplane designed to operate in dangerous and repetitive situations. With the advent of civil applications, UAVs are emerging as a valid option in commercial scenarios. To be economically viable, the same platform should implement a variety of missions with little reconfiguration time and overhead. This paper presents a middleware-based architecture specially suited to operate as a flexible payload and mission controller in a UAV. The system is composed of low-cost computing devices connected by a network. The functionality is divided into reusable services distributed over a number of nodes, with a middleware managing their lifecycle and communication. Some research has been done in this area, yet it is mainly focused on the control domain and its real-time operation. Our proposal differs in that we address the implementation of adaptable and reconfigurable unmanned missions on low-cost and low-resource hardware.
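The paper's middleware is not reproduced here, but a minimal sketch of the service-with-managed-lifecycle idea it describes might look as follows; the names (Service, Middleware, CameraService) are illustrative assumptions, not the paper's API.

```python
# Sketch of reusable services whose lifecycle a middleware manages; a real
# system would also distribute services across nodes and route their messages.
from abc import ABC, abstractmethod

class Service(ABC):
    """A reusable unit of payload/mission functionality."""
    @abstractmethod
    def start(self) -> None: ...
    @abstractmethod
    def stop(self) -> None: ...

class CameraService(Service):
    def start(self) -> None:
        print("camera service started")

    def stop(self) -> None:
        print("camera service stopped")

class Middleware:
    """Owns the lifecycle of deployed services."""
    def __init__(self) -> None:
        self._services: list[Service] = []

    def deploy(self, service: Service) -> None:
        self._services.append(service)
        service.start()

    def shutdown(self) -> None:
        for service in reversed(self._services):
            service.stop()

if __name__ == "__main__":
    mw = Middleware()
    mw.deploy(CameraService())
    mw.shutdown()  # reconfiguring a mission amounts to swapping services
```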
Abstract:
Cooperative transmission can be seen as a "virtual" MIMO system, where the multiple transmit antennas are in fact distributed between the source and the relay terminal. Depending on the system design, diversity/multiplexing gains are achievable. This design involves the definition of the type of retransmission (incremental redundancy, repetition coding), the design of the distributed space-time codes, the error-correcting scheme, the operation of the relay (decode-and-forward or amplify-and-forward), and the number of antennas at each terminal. Proposed schemes are evaluated under different conditions in combination with forward error correcting (FEC) codes, both for linear and near-optimum (sphere decoder) receivers, for their possible implementation in downlink high-speed packet services of cellular networks. Results show the benefits of coded cooperation over direct transmission in terms of increased throughput. It is shown that multiplexing gains are observed even if the mobile station features a single antenna, provided that cell-wide reuse of the relay radio resource is possible.
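As a toy illustration of the repetition-coded, decode-and-forward case evaluated in the paper, the Monte Carlo sketch below compares direct BPSK transmission over AWGN with an idealized relayed repetition (error-free source-relay link, maximum-ratio combining at the destination); the parameters are illustrative, and fading, FEC, and multiple antennas are deliberately omitted.

```python
# Idealized sketch: direct BPSK over AWGN vs. decode-and-forward repetition
# with an assumed error-free source-relay link and MRC at the destination.
import numpy as np

rng = np.random.default_rng(0)
n_bits, snr_db = 100_000, 4.0
sigma = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))  # noise std for unit-energy BPSK

bits = rng.integers(0, 2, n_bits)
x = 1.0 - 2.0 * bits  # BPSK mapping: 0 -> +1, 1 -> -1

y_direct = x + sigma * rng.standard_normal(n_bits)
y_relayed = x + sigma * rng.standard_normal(n_bits)  # relay repeats the symbol
y_combined = y_direct + y_relayed                    # maximum-ratio combining

ber_direct = np.mean((y_direct < 0) != (bits == 1))
ber_coop = np.mean((y_combined < 0) != (bits == 1))
print(f"direct BER: {ber_direct:.4f}, cooperative BER: {ber_coop:.4f}")
```

Over a static AWGN channel the combining yields a 3 dB SNR gain; the diversity gains discussed in the abstract arise once the two links fade independently.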
Abstract:
The numerous yeast genome sequences presently available provide a rich source of information for functional as well as evolutionary genomics but unequally cover the large phylogenetic diversity of extant yeasts. We present here the complete sequence of the nuclear genome of the haploid type strain of Kuraishia capsulata (CBS1993(T)), a nitrate-assimilating Saccharomycetales of uncertain taxonomy, isolated from tunnels of insect larvae underneath coniferous barks and characterized by its copious production of extracellular polysaccharides. The sequence is composed of seven scaffolds, one per chromosome, totaling 11.4 Mb and containing 6,029 protein-coding genes, ~13.5% of which are interrupted by introns. This GC-rich yeast genome (45.7%) appears phylogenetically related to the few other nitrate-assimilating yeasts sequenced so far, Ogataea polymorpha, O. parapolymorpha, and Dekkera bruxellensis, with which it shares a very reduced number of tRNA genes, a novel tRNA-sparing strategy, and a common nitrate assimilation cluster, three features specific to this group of yeasts. Centromeres were recognized in GC-poor troughs of each scaffold. The strain bears MAT alpha genes at a single MAT locus and presents a significant degree of conservation with Saccharomyces cerevisiae genes, suggesting that it can perform sexual cycles in nature, although not all genes involved in meiosis were recognized. The complete absence of conservation of synteny between K. capsulata and any other yeast genome described so far, including the three other nitrate-assimilating species, validates the interest of this species for long-range evolutionary genomic studies among Saccharomycotina yeasts.
Abstract:
BACKGROUND: Modern sequencing technologies have massively increased the amount of data available for comparative genomics. Whole-transcriptome shotgun sequencing (RNA-seq) provides a powerful basis for comparative studies. In particular, this approach holds great promise for emerging model species in fields such as evolutionary developmental biology (evo-devo). RESULTS: We have sequenced early embryonic transcriptomes of two non-drosophilid dipteran species: the moth midge Clogmia albipunctata, and the scuttle fly Megaselia abdita. Our analysis includes a third, published, transcriptome for the hoverfly Episyrphus balteatus. These emerging models for comparative developmental studies close an important phylogenetic gap between Drosophila melanogaster and other insect model systems. In this paper, we provide a comparative analysis of early embryonic transcriptomes across species, and use our data for a phylogenomic re-evaluation of dipteran phylogenetic relationships. CONCLUSIONS: We show how comparative transcriptomics can be used to create useful resources for evo-devo, and to investigate phylogenetic relationships. Our results demonstrate that de novo assembly of short (Illumina) reads yields high-quality, high-coverage transcriptomic data sets. We use these data to investigate deep dipteran phylogenetic relationships. Our results, based on a concatenation of 160 orthologous genes, provide support for the traditional view of Clogmia being the sister group of Brachycera (Megaselia, Episyrphus, Drosophila), rather than that of Culicomorpha (which includes mosquitoes and blackflies).
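The phylogenomic step described above rests on concatenating per-gene ortholog alignments into a single supermatrix; a minimal, hedged sketch of that bookkeeping (with toy two-gene alignments in place of the study's 160) is shown below.

```python
# Toy sketch of supermatrix concatenation for phylogenomics; the taxa come
# from the abstract, the short alignments are invented placeholders.
from collections import defaultdict

taxa = ["Clogmia", "Megaselia", "Episyrphus", "Drosophila"]
gene_alignments = [
    {"Clogmia": "ATGG", "Megaselia": "ATGA", "Episyrphus": "ATGA", "Drosophila": "ATGA"},
    {"Clogmia": "TTAC", "Megaselia": "TTCC", "Episyrphus": "TTCC"},  # one taxon missing
]

supermatrix: dict[str, str] = defaultdict(str)
for aln in gene_alignments:
    length = len(next(iter(aln.values())))
    for taxon in taxa:
        # Pad with gaps when a taxon lacks an ortholog for this gene.
        supermatrix[taxon] += aln.get(taxon, "-" * length)

for taxon in taxa:
    print(f">{taxon}\n{supermatrix[taxon]}")
```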
Abstract:
Networked international product development is an important part of success in today's changing business world. To make operations more efficient, project activities must also be adapted to the international operating environment, and to remain competitive, project activities must be improved continuously. One means to this end is project learning, which can be promoted in many different ways. This thesis focuses on the learning opportunities offered by developing project knowledge management. The literature states that sharing project knowledge and exploiting it in subsequent projects is one of the prerequisites of project learning, and this has been taken as the central perspective of this study. To limit the scope of the research, the work examines project learning specifically between international product development projects. The goal of the work is to present the central challenges of project learning and to find a concrete solution to these challenges. Product development activities and an internationally distributed project organization additionally face particular challenges, such as the dispersion of knowledge, turnover of project personnel, confidentiality of information, and geographical challenges (e.g., time zones and site location). These particular challenges were taken into account in the search for a solution. The challenges were addressed with an information technology based solution, designed specifically around the needs and challenges of the case organization. The work examines the effect of the designed solution on project learning and how it answers the identified challenges. The results showed that project learning took place, even though the learning was difficult to observe directly among the members of the research organization. Project learning can nevertheless be said to occur if project knowledge is easily available to the whole project team and is well organized; these conditions, among others, were fulfilled. Project learning is generally seen as a challenging development area in the case organization. A large part of the knowledge is so-called tacit knowledge, which is difficult or impossible to put into written form; consequently, knowledge transfer largely depends on personal interaction. Nevertheless, project learning can be developed through various operating models and methods. Development, however, requires resources, perseverance, and time. Many changes may also require a change in organizational culture and influencing the members of the organization. Motivation, positive images, and clear strategic goals create a stable basis for developing project learning.
Abstract:
The objective of this work was to assess the genetic diversity and population structure of wheat genotypes, to detect significant and stable genetic associations, and to evaluate the efficiency of statistical models to identify chromosome regions responsible for the expression of spike-related traits. Eight important spike characteristics were measured during five growing seasons in Serbia. A set of 30 microsatellite markers positioned near important agronomic loci was used to evaluate genetic diversity, resulting in a total of 349 alleles. The marker-trait associations were analyzed using the general linear and mixed linear models. The results obtained for number of allelic variants per locus (11.5), average polymorphic information content value (0.68), and average gene diversity (0.722) showed that the exceptional level of polymorphism in the genotypes is the main requirement for association studies. The population structure estimated by model-based clustering distributed the genotypes into six subpopulations according to the log probability of the data. Significant and stable associations were detected on chromosomes 1B, 2A, 2B, 2D, and 6D, which explained from 4.7 to 40.7% of total phenotypic variation. The general linear model identified a significantly larger number of marker-trait associations (192) than the mixed linear model (76). The mixed linear model identified nine markers associated with six traits.
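A hedged sketch of the general-linear-model association test described above, for a single biallelic marker with population-structure covariates (the Q matrix from model-based clustering); the data are simulated placeholders, and the mixed linear model would additionally include a kinship random effect, which this sketch omits.

```python
# Sketch of a GLM marker-trait association: regress the phenotype on marker
# dosage plus subpopulation membership covariates. All data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 96
marker = rng.integers(0, 3, n).astype(float)  # allele dosage 0/1/2
q = rng.dirichlet(np.ones(3), n)[:, :2]       # structure covariates (drop one column)
trait = 0.4 * marker + q @ np.array([1.0, -0.5]) + rng.standard_normal(n)

X = sm.add_constant(np.column_stack([marker, q]))
fit = sm.OLS(trait, X).fit()
print(f"marker effect: {fit.params[1]:.3f}, p-value: {fit.pvalues[1]:.3g}")
```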
Abstract:
This Master's thesis presents general principles of software testing and verification, and discusses the verification of smartphone software in more detail. The thesis also introduces the Symbian operating system used in smartphones. In the practical part of the work, a server running on the Symbian operating system was designed and implemented to monitor and record the usage of system resources. Verification is an important and costly task in the smartphone software development cycle, and costs can be reduced by automating part of the verification process. The implemented server automates the monitoring of system resources by recording information about them to a file while tests are run. When the tests are run again, the new results are compared against the baseline recording. If the results are not within the error limits set by the user, the user is notified. Determining the error limits and the baseline recording may prove difficult; however, if they are defined appropriately, the server provides testers with useful information about deviations in system resource consumption.
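A minimal sketch of the baseline-comparison logic described above: measurements from a new test run are compared against a recorded baseline, and any metric outside the user-defined error limits is reported. The metric names and tolerance are illustrative, not taken from the thesis.

```python
# Compare a new run's resource measurements against a baseline recording and
# report the metrics that exceed the user-defined relative error limit.
def check_against_baseline(baseline: dict[str, float],
                           current: dict[str, float],
                           tolerance: float = 0.10) -> list[str]:
    """Return descriptions of metrics deviating from the baseline by more than `tolerance`."""
    violations = []
    for metric, reference in baseline.items():
        value = current.get(metric, 0.0)
        if reference and abs(value - reference) / reference > tolerance:
            violations.append(f"{metric}: baseline {reference}, current {value}")
    return violations

baseline = {"heap_kb": 2048.0, "open_file_handles": 12.0}
current = {"heap_kb": 2600.0, "open_file_handles": 12.0}
for line in check_against_baseline(baseline, current):
    print("deviation:", line)  # -> heap_kb exceeds the 10% limit
```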
Abstract:
The main goal of this article is to illustrate a way of detecting identity foundations in people using extended autobiographical multi-methodology, a qualitative approach that combines different techniques to study the narrative construction of identity. There are four groups of techniques: 1) in-depth interviews; 2) the revised self-portrait technique; 3) the analysis of artifacts, routines and ways of living; and 4) “psycho-geographical maps.” “Identity foundations” are understood as a set of resources (a toolbox) that have been historically accumulated, culturally developed, and socially distributed and transmitted, and which are essential for defining and presenting oneself. Two examples are provided that illustrate how to use this methodological approach to achieve the aforementioned objective. In conclusion, the study recommends taking into account the explicit and implicit underlying cultural forces involved in constructing human identity.
Abstract:
The nature of client-server architecture implies that some modules are delivered to customers. These publicly distributed commercial software components are at risk, because users (and simultaneously potential malefactors) have physical access to some components of the distributed system. The problem becomes even worse if interpreted programming languages are used for the creation of client-side modules. Java, which was designed to be compiled into platform-independent byte-code, is no exception and carries this additional risk. Along with advantages such as verifying the code before execution (to ensure that a program does not perform illegal operations), Java has some disadvantages: at the byte-code stage, a Java program still contains comments, line numbers, and other instructions that can be used for reverse-engineering. This Master's thesis focuses on the protection of Java-based client-server applications. I present a mixture of methods to protect software from tortious acts, then put the theoretical assumptions into practice and examine their efficiency on examples of Java code. One of the criteria for evaluating the system is that the product is used in the specialized area of interactive television.
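The thesis's specific protection methods are not reproduced here, but one representative technique in such a mixture is obfuscating string literals so they no longer appear verbatim in the distributed byte-code; a generic, hedged sketch (in Python for brevity, with an invented key and string) follows.

```python
# Generic string-obfuscation sketch: literals are shipped XOR-masked and
# decoded only at run time, hindering naive reverse-engineering by inspection.
KEY = 0x5A  # illustrative single-byte key; real schemes use stronger masking

def obfuscate(text: str) -> bytes:
    return bytes(b ^ KEY for b in text.encode("utf-8"))

def deobfuscate(blob: bytes) -> str:
    return bytes(b ^ KEY for b in blob).decode("utf-8")

# The shipped module would store only the masked form below.
masked = obfuscate("license-server.example.org")
assert deobfuscate(masked) == "license-server.example.org"
print(masked)
```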
Abstract:
This paper presents research concerning the conversion of non-accessible web pages containing mathematical formulae into accessible versions through an OCR (Optical Character Recognition) tool. The objective of this research is twofold. First, to establish criteria for evaluating the potential accessibility of mathematical web sites, i.e. the feasibility of converting non-accessible (non-MathML) math sites into accessible ones (MathML). Second, to propose a data model and a mechanism to publish evaluation results, making them available to the educational community, who may use them as a quality measurement for selecting learning material. Results show that conversion using OCR tools is not viable for math web pages, mainly for two reasons: many of these pages are designed to be interactive, making a correct conversion difficult, if not almost impossible; and formulae (either images or text) have been written without taking into account standards of math writing, so OCR tools do not properly recognize math symbols and expressions. In spite of these results, we think the proposed methodology to create and publish evaluation reports may be rather useful in other accessibility assessment scenarios.
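To make the conversion target concrete, the sketch below contrasts a flat textual formula with its accessible MathML presentation markup, which assistive technologies can interpret; the formula is an invented example.

```python
# The same expression, x^2 + 1, as flat text and as MathML presentation markup.
flat_formula = "x^2 + 1"  # what an image caption or plain text typically carries

mathml_formula = (
    '<math xmlns="http://www.w3.org/1998/Math/MathML">'
    "<msup><mi>x</mi><mn>2</mn></msup><mo>+</mo><mn>1</mn>"
    "</math>"
)
print(flat_formula)
print(mathml_formula)
```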
Abstract:
Forecasting coal resources and reserves is critical for coal mine development. Thickness maps are commonly used for assessing coal resources and reserves; however, they are of limited use for capturing coal splitting effects in thick and heterogeneous coal zones. As an alternative, three-dimensional geostatistical methods are used to populate the facies distribution within a densely drilled heterogeneous coal zone in the As Pontes Basin (NW Spain). Coal distribution in this zone is mainly characterized by coal-dominated areas in the central parts of the basin interfingering with terrigenous-dominated alluvial fan zones at the margins. The three-dimensional models obtained are applied to forecast coal resources and reserves. Predictions using subsets of the entire dataset are also generated to understand the performance of the methods under limited data constraints. Three-dimensional facies interpolation methods tend to overestimate coal resources and reserves due to interpolation smoothing. Facies simulation methods yield resource predictions similar to conventional thickness map approximations. Reserves predicted by facies simulation methods are mainly influenced by: a) the specific coal proportion threshold used to determine whether a block can be recovered, and b) the capability of the modelling strategy to reproduce areal trends in coal proportions and splitting between coal-dominated and terrigenous-dominated areas of the basin. Reserves predictions differ between the simulation methods, even with dense conditioning datasets. Simulation methods can be ranked according to the correlation of their outputs with predictions from the directly interpolated coal proportion maps: a) with low-density datasets, sequential indicator simulation with trends yields the best correlation; b) with high-density datasets, sequential indicator simulation with post-processing yields the best correlation, because the areal trends are provided implicitly by the dense conditioning data.
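The block-recovery rule described above can be made concrete with a short sketch: a simulated block counts toward reserves only if its coal proportion reaches the chosen threshold. Grid values, block dimensions, and the threshold below are illustrative placeholders, not the study's figures.

```python
# Apply a coal-proportion cutoff to one simulated facies realization and
# accumulate the coal volume of the recoverable blocks.
import numpy as np

rng = np.random.default_rng(2)
coal_proportion = rng.random((10, 10, 5))   # stand-in for a simulated realization
block_volume_m3 = 25.0 * 25.0 * 2.0         # illustrative block dimensions
threshold = 0.5                             # recovery cutoff

recoverable = coal_proportion >= threshold
reserves_m3 = coal_proportion[recoverable].sum() * block_volume_m3
print(f"recoverable blocks: {int(recoverable.sum())}, coal volume: {reserves_m3:.0f} m^3")
```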
Abstract:
In lentic water bodies, such as lakes, the water temperature near the surface typically increases during the day and decreases during the night as a consequence of the diurnal radiative forcing (solar and infrared radiation). These temperature variations penetrate vertically into the water, transported mainly by heat conduction enhanced by eddy diffusion, which may vary due to atmospheric conditions, surface wave breaking, and the internal dynamics of the water body. These two processes can be described in terms of an effective thermal diffusivity, which can be experimentally estimated. However, the transparency of the water (depending on turbidity) also allows solar radiation to penetrate below the surface into the water body, where it is locally absorbed (either by the water or by the deployed sensors). This process makes the estimation of effective thermal diffusivity from experimental water temperature profiles more difficult. In this study, we analyze water temperature profiles in a lake with the aim of showing that assessing the role played by radiative forcing is necessary to estimate the effective thermal diffusivity. To this end we investigate diurnal water temperature fluctuations with depth. We quantify the effect of locally absorbed radiation and assess the impact of atmospheric conditions (wind speed, net radiation) on the estimation of the thermal diffusivity. The whole analysis is based on the results of fiber-optic distributed temperature sensing, which allows measurements of the temperature profile in the water and near the water surface at unprecedentedly high spatial resolution (∼4 mm).
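For a purely diffusive medium forced by a sinusoidal surface temperature, the diurnal amplitude decays with depth as A(z) = A(0) exp(-z/D), with damping depth D = sqrt(2κ/ω); the sketch below estimates an effective diffusivity from amplitudes at two depths under that assumption, which in-water absorption of radiation (the study's point) would bias. Depths and amplitudes are illustrative.

```python
# Estimate an effective thermal diffusivity from the depth-decay of the
# diurnal temperature amplitude, assuming pure diffusion (no radiative source).
import numpy as np

omega = 2.0 * np.pi / 86400.0   # diurnal angular frequency (rad/s)
z1, z2 = 0.10, 0.50             # sensor depths (m)
amp1, amp2 = 1.8, 0.6           # fitted diurnal amplitudes at z1, z2 (degC)

damping_depth = (z2 - z1) / np.log(amp1 / amp2)  # from A(z) = A0 * exp(-z/D)
kappa_eff = 0.5 * omega * damping_depth ** 2     # D = sqrt(2*kappa/omega)
print(f"effective thermal diffusivity: {kappa_eff:.2e} m^2/s")
```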
Abstract:
We propose a new approach and related indicators for globally distributed software support and development based on a 3-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying the domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that the global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. Due to the new performance indicators based on lead times and their variation with fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
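The lead-time indicators described above reduce to simple bookkeeping over ticket timestamps; a minimal sketch (with invented timestamps) is shown below.

```python
# Compute per-ticket lead times (creation to resolution) and their variation,
# the quantities the case company's indicators are built on. Data are invented.
from datetime import datetime
from statistics import mean, stdev

tickets = [  # (opened, resolved)
    (datetime(2012, 3, 1, 9, 0), datetime(2012, 3, 3, 17, 0)),
    (datetime(2012, 3, 2, 10, 0), datetime(2012, 3, 2, 15, 0)),
    (datetime(2012, 3, 5, 8, 0), datetime(2012, 3, 9, 12, 0)),
]
lead_times_h = [(done - opened).total_seconds() / 3600.0 for opened, done in tickets]
print(f"mean lead time: {mean(lead_times_h):.1f} h, std dev: {stdev(lead_times_h):.1f} h")
```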
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
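The well-known trusted-third-party solution recalled above has a simple shape: the TTP releases the items only once both parties have deposited theirs, so either both obtain the expected item or neither learns anything. The sketch below is illustrative only; a real protocol would also need authentication, item validation, and the synchrony assumptions discussed in the thesis.

```python
# Toy trusted-third-party fair exchange: items are released only when both
# parties have deposited; otherwise the exchange aborts with nothing leaked.
class TrustedThirdParty:
    def __init__(self) -> None:
        self._deposits: dict[str, str] = {}

    def deposit(self, party: str, item: str) -> None:
        self._deposits[party] = item

    def exchange(self, a: str, b: str) -> tuple[str, str] | None:
        """Return (item for a, item for b) if both deposited, else abort with None."""
        if a in self._deposits and b in self._deposits:
            return self._deposits[b], self._deposits[a]
        return None

ttp = TrustedThirdParty()
ttp.deposit("alice", "item A")
print(ttp.exchange("alice", "bob"))  # None: bob has not deposited yet
ttp.deposit("bob", "item B")
print(ttp.exchange("alice", "bob"))  # ('item B', 'item A')
```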