789 results for Building Blocks for Creative Practice
Abstract:
As the anniversaries of the 2008 tornadoes and floods approach, the Rebuild Iowa Office vision of a safer, stronger and smarter Iowa is coming into sharper focus. While much more remains to be done, hundreds of displaced Iowans and businesses are on the road to recovery, and the building blocks for rebuilding communities are coming together. Recovery is a marathon, not a sprint, and the work done so far could not have been accomplished without an extensive recovery planning effort and an unprecedented level of cooperation among local, state and federal governments, private citizens, businesses and non-profit organizations. A rebirth and recovery is underway in Iowa.
Abstract:
This PhD thesis addresses the issue of alleviating the burden of developing ad hoc applications. Such applications have the particularity of running on mobile devices, communicating in a peer-to-peer manner and implementing proximity-based semantics. A typical example is a radar application in which users see their own avatar and the avatars of their friends on a map on their mobile phone. Such applications have become increasingly popular with the advent of the latest generation of mobile smart phones, with their impressive computational power, peer-to-peer communication capabilities and location detection technology. Unfortunately, existing programming support for such applications is limited, hence the need to address this issue in order to alleviate their development burden. This thesis tackles this problem by providing several tools for application development support. First, it provides the location-based publish/subscribe service (LPSS), a communication abstraction that elegantly captures recurrent communication issues and thus dramatically reduces code complexity. LPSS is implemented in a modular manner in order to target two different network architectures. One pragmatic implementation is aimed at mainstream infrastructure-based mobile networks, where mobile devices communicate through fixed antennas. The other, fully decentralized, implementation targets emerging mobile ad hoc networks (MANETs), where no fixed infrastructure is available and communication can only occur in a peer-to-peer fashion. For each of these architectures, various implementation strategies, tailored for different application scenarios, can be parametrized at deployment time. Second, this thesis provides two location-based message diffusion protocols, namely 6Shot broadcast and 6Shot multicast, specifically aimed at MANETs and fine-tuned to be used as building blocks for LPSS. Finally, this thesis proposes Phomo, a phone motion testing tool that makes it possible to test the proximity semantics of ad hoc applications without having to move around with mobile devices. These development support tools are packaged in a coherent middleware framework called Pervaho.
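The abstract does not spell out the LPSS interface. Purely as a rough illustration of what a location-scoped publish/subscribe abstraction could look like, here is a toy Python sketch; every name in it (Subscription, LocationPubSub, radius_m, and so on) is invented for illustration and none of it is the actual Pervaho/LPSS API.

```python
import math
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# Hypothetical sketch only, not the real LPSS API: a centralized toy broker
# that delivers a publication to subscribers of the same topic whose last
# known position lies within the subscription's proximity radius.

@dataclass
class Subscription:
    topic: str
    radius_m: float                       # proximity scope of the subscription
    callback: Callable[[dict], None]      # invoked on each matching publication

@dataclass
class LocationPubSub:
    subs: List[Tuple[Tuple[float, float], Subscription]] = field(default_factory=list)

    def subscribe(self, position: Tuple[float, float], sub: Subscription) -> None:
        self.subs.append((position, sub))

    def publish(self, position: Tuple[float, float], topic: str, payload: dict) -> None:
        for sub_pos, sub in self.subs:
            if sub.topic == topic and _distance_m(position, sub_pos) <= sub.radius_m:
                sub.callback(payload)

def _distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    # Euclidean stand-in; a real service would use geodesic distance.
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Usage in the spirit of the radar example: a user is notified of friends nearby.
bus = LocationPubSub()
bus.subscribe((0.0, 0.0), Subscription("friends", 500.0, lambda msg: print("nearby:", msg)))
bus.publish((30.0, 40.0), "friends", {"user": "bob"})   # within 500 m, delivered
```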
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards involved, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated on a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. By contrast, the duration of the maturity phase is much more variable and related to thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the thunderstorm's total duration and the maximum value of each variable considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
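As a rough illustration of the object hierarchy described above, where four observation-derived object types compose one higher-level thunderstorm object, here is a hypothetical Python sketch; all class and field names are invented for illustration and are not taken from the study.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical object model mirroring the composition described in the abstract:
# four observation-derived object types are building blocks of one Thunderstorm.

@dataclass
class CappiComposite:       # radar 1-km CAPPI reflectivity composite
    max_dbz: float

@dataclass
class VolumetricScan:       # radar reflectivity volumetric data
    echo_top_km: float

@dataclass
class CgFlashes:            # cloud-to-ground lightning observations
    count: int

@dataclass
class IcFlashes:            # intra-cloud lightning observations
    count: int

@dataclass
class Thunderstorm:
    cappi: List[CappiComposite] = field(default_factory=list)
    volumes: List[VolumetricScan] = field(default_factory=list)
    cg: List[CgFlashes] = field(default_factory=list)
    ic: List[IcFlashes] = field(default_factory=list)

    def flash_rate(self, window_min: float) -> float:
        """Total (IC + CG) flashes per minute, the intensity proxy used above."""
        total = sum(f.count for f in self.cg) + sum(f.count for f in self.ic)
        return total / window_min
```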
Abstract:
Integrated approaches using different in vitro methods in combination with bioinformatics can (i) increase the success rate and speed of drug development; (ii) improve the accuracy of toxicological risk assessment; and (iii) increase our understanding of disease. Three-dimensional (3D) cell culture models are important building blocks of this strategy, which has emerged in recent years. The majority of these models are organotypic, i.e., they aim to reproduce major functions of an organ or organ system. This implies in many cases that more than one cell type forms the 3D structure, and often matrix elements play an important role. This review summarizes the state of the art concerning commonalities of the different models. For instance, the theory of mass transport/metabolite exchange in 3D systems and the special analytical requirements for test endpoints in organotypic cultures are discussed in detail. In the next part, 3D model systems for selected organs (liver, lung, skin, brain) are presented and characterized in dedicated chapters. Also, 3D approaches to the modeling of tumors are presented and discussed. All chapters give a historical background, illustrate the large variety of approaches, and highlight upsides and downsides as well as specific requirements. Moreover, they refer to applications in disease modeling, drug discovery and safety assessment. Finally, consensus recommendations indicate a roadmap for the successful implementation of 3D models in routine screening. It is expected that the use of such models will accelerate progress by reducing error rates and wrong predictions from compound testing.
Abstract:
With the advancement of high-throughput sequencing and the dramatic increase of available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to the phylogenetic inference of species' evolutionary history. Among different types of genome sequences, protein coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. The current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to the current mechanistic codon models are (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the above assumptions. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one that holds all of them to the most general one that relaxes all of them. The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the best model aligned with the underlying characteristics of each data set. Our experiments show that in none of the real data sets is holding all three assumptions realistic; using simple models that hold these assumptions can therefore be misleading and result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes all three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. Furthermore, I show through several experiments that the proposed general model is biologically plausible. (The original record repeats this abstract in French; the translation above merges the additional content of the French version.)
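As an illustration of the Kronecker-product construction mentioned above, the following Python sketch (all parameter values and helper names are invented) builds a 64x64 codon-level generator from three 4x4 nucleotide-level generators. The Kronecker sum shown permits only single-nucleotide substitutions; adding cross terms such as kron(kron(Q1, Q2), I) is the kind of generalization that admits double and triple substitutions, which is what the thesis explores.

```python
import numpy as np

# Hedged sketch, not the thesis's actual model: per-position 4x4 nucleotide
# generators are combined into a codon-level generator via Kronecker products.

def nucleotide_generator(kappa: float) -> np.ndarray:
    """Simple K80-style 4x4 generator (order A, C, G, T): transitions
    (A<->G, C<->T) scaled by kappa, transversions by 1, rows summing to zero."""
    Q = np.ones((4, 4))
    for i, j in [(0, 2), (2, 0), (1, 3), (3, 1)]:   # transition pairs
        Q[i, j] = kappa
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))             # diagonal = -(row sum)
    return Q

I4 = np.eye(4)
Q1, Q2, Q3 = (nucleotide_generator(kappa=2.0) for _ in range(3))

# Kronecker sum: independent evolution of the three codon positions,
# i.e. only single-nucleotide substitutions have nonzero rates.
Q_codon = (np.kron(np.kron(Q1, I4), I4)
           + np.kron(np.kron(I4, Q2), I4)
           + np.kron(np.kron(I4, I4), Q3))

assert Q_codon.shape == (64, 64)
assert np.allclose(Q_codon.sum(axis=1), 0.0)        # valid generator rows
```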
Abstract:
Iowa's infrastructure is at a crossroads. A stalwart collection of Iowans dared to consider Iowa's future economy, the way ahead for future generations, and what infrastructure will be required – and what will not be required – for Iowa to excel. The findings are full of opportunity and challenge. The Infrastructure Plan for Iowa's Future Economy: A Strategic Direction tells the story and points the way to a strong economy and quality of life for our children and our children's children. This plan is different from most in that the motivation for its development came not from a requirement to comply or achieve a particular milestone but, rather, from a recognition that infrastructure, in order to ensure a globally competitive future economy, must transform from that of past generations. It is not news that all infrastructure – from our rich soil to our bridges – is a challenge to maintain. Even before the natural disasters of 2008 and the national economic crisis, Iowa was tested in its capacity not only to sustain its infrastructure but to anticipate future needs. It is imperative that wise investments and planning guide Iowa's infrastructure development. This plan reflects Iowa's collective assessment of its infrastructure – buildings, energy, natural resources, telecommunications, and transportation – as, literally, the interdependent building blocks of our future. Over the months of planning, more than 200 Iowans participated on committees, on a task force, or in community meetings. The plan is for all of Iowa, reflected in the private, nonprofit, and public interests involved throughout the process. Iowa's success depends on all of Iowa, across all sectors and interests, engaging in its implementation. The Infrastructure Plan for Iowa's Future Economy: A Strategic Direction sets a clear and bold direction for all stakeholders, making it clear that all have a responsibility and an opportunity to contribute to Iowa's success.
Abstract:
Planarians have been established as an ideal model organism for stem cell research and regeneration. Planarian regeneration and homeostasis require an exquisite balancing act between cell death and cell proliferation as new tissues are made (epimorphosis) and existing tissues remodeled (morphallaxis). Some of the genes and mechanisms that control cell proliferation and pattern formation are known. However, studies about cell death during remodeling are few and far between. We have studied the gene Gtdap-1, the planarian ortholog of human death-associated protein-1 or DAP-1. DAP-1 together with DAP-kinase has been identified as a positive mediator of programmed cell death induced by gamma-interferon in HeLa cells. We have found that the gene functions at the interface between autophagy and cell death in the remodeling of the organism that occurs during regeneration and starvation in sexual and asexual races of planarians. Our data suggest that autophagy of existing cells may be essential to fuel the continued proliferation and differentiation of stem cells by providing the necessary energy and building blocks to neoblasts.
Abstract:
It is usually believed that the idea of applying logical methods to constructivist phenomenalism was, in general, a result of Russell's originality. This paper argues that some important ideas were in fact due to Mach, Moore and Whitehead. According to the author, Russell got from Mach the general idea of epistemology as an analysis of scientific concepts and, especially, the idea of sensations as the building blocks for his logical construction. Moore made Russell believe that only sensations are known in a direct way, so that the existence of external objects as the cause of our perceptions is only inferred. Moreover, according to the author, Russell's views on sense data (his sensibilia) are also due to Moore. Finally, Russell got from Whitehead the idea of the phenomenal reconstruction as an alternative to the causal theory of perception, and also how the logical construction should be done. The author also undertakes a detailed analysis of some early, not very well known works of Whitehead.
Abstract:
Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie algorithm by providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
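As a rough illustration of the Gillespie algorithm applied to a self-regulated gene, the following Python sketch simulates a negatively autoregulated protein with a Hill-repressed production rate; all rate constants and the model form are invented stand-ins, not the system analyzed in the paper.

```python
import random

# Illustrative Gillespie simulation of one negatively self-regulated gene:
# protein is produced at a Hill-repressed rate and degraded linearly.
# Parameter values are arbitrary; the long-run histogram of `n` approximates
# the kind of steady-state distribution studied above.

def gillespie_self_regulated(t_end=1_000.0, k_prod=10.0, k_deg=0.1,
                             K=20.0, hill=2.0, seed=0):
    rng = random.Random(seed)
    t, n = 0.0, 0                # simulation time, protein copy number
    samples = []
    while t < t_end:
        a_prod = k_prod / (1.0 + (n / K) ** hill)   # repressed production
        a_deg = k_deg * n                           # first-order decay
        a_total = a_prod + a_deg
        t += rng.expovariate(a_total)               # waiting time to next event
        if rng.random() * a_total < a_prod:         # choose which reaction fires
            n += 1
        else:
            n -= 1
        samples.append(n)
    # Note: samples are per-event; a proper stationary estimate would weight
    # each visited state by its holding time.
    return samples

counts = gillespie_self_regulated()
print("mean copy number:", sum(counts) / len(counts))
```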
Abstract:
Biomolecular structures are assemblies of emergent anisotropic building modules such as uniaxial helices or biaxial strands. We provide an approach to understanding a marginally compact phase of matter that is occupied by proteins and DNA. This phase, which is in some respects analogous to the liquid crystal phase for chain molecules, stabilizes a range of shapes that can be obtained by sequence-independent interactions occurring intra- and intermolecularly between polymeric molecules. We present a singularity-free self-interaction for a tube in the continuum limit and show that this results in the tube being positioned in the marginally compact phase. Our work provides a unified framework for understanding the building blocks of biomolecules.
Abstract:
Despite the important benefits for firms of commercial initiatives on the Internet, e-commerce is still an emerging distribution channel, even in developed countries. Thus, more needs to be known about the mechanisms affecting its development. A large number of works have studied firms' e-commerce adoption from technological, intraorganizational, institutional, or other specific perspectives, but there is a need for adequately tested integrative frameworks. Hence, this work proposes and tests a model of firms' business-to-consumer (B2C) e-commerce adoption that is founded on a holistic vision of the phenomenon. With this integrative approach, the authors analyze the joint influence of environmental, technological, and organizational factors; moreover, they evaluate this effect over time. Using various representative Spanish data sets covering the period 1996-2005, the findings demonstrate the suitability of the holistic framework. Likewise, some lessons are learned from the analysis of the key building blocks. In particular, the current study provides evidence for the debate about the effect of competitive pressure, since the findings show that competitive pressure disincentivizes e-commerce adoption in the long term. The results also show that the development or enrichment of consumers' consumption patterns, the technological readiness of the market forces, the firm's global scope, and its competences in innovation continuously favor e-commerce adoption.
Abstract:
A well-known drawback of IP networks is that they cannot guarantee a specific quality of service (QoS) for transmitted packets. The following two techniques are considered the most promising for providing quality of service: Differentiated Services (DiffServ) and QoS routing. DiffServ is a fairly new quality-of-service mechanism for the Internet defined by the IETF. DiffServ offers scalable service differentiation without per-hop signalling or per-flow state management, and is a good example of decentralized network design. The goal of this quality-of-service mechanism is to simplify the design of communication systems: a network node can be built from a small set of well-defined building blocks. QoS routing is a routing mechanism in which traffic routes are determined on the basis of the network's available resources. This thesis investigates a new QoS routing approach called Simple Multipath Routing. The purpose of this work is to design a QoS controller for DiffServ; the controller proposed here is an attempt to combine DiffServ and QoS routing mechanisms. The experimental part of the work focuses in particular on QoS routing algorithms.
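The thesis's Simple Multipath Routing algorithm is not detailed in the abstract. Purely as an illustration of one multipath idea a QoS controller could build on, the following Python sketch finds edge-disjoint least-cost paths by repeated Dijkstra runs with edge removal; the graph representation and function names are invented for this example.

```python
import heapq

# Hedged sketch of a generic multipath idea (not the thesis's algorithm):
# run Dijkstra, remove the edges of the path found, and repeat to obtain
# edge-disjoint alternatives that traffic could be split across.

def dijkstra(graph, src, dst):
    """graph: {node: {neighbor: cost}}; returns a least-cost path or None."""
    dist, prev, done = {src: 0.0}, {}, set()
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == dst:                      # reconstruct path back to the source
            path = [u]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return None

def edge_disjoint_paths(graph, src, dst, k=2):
    g = {u: dict(nbrs) for u, nbrs in graph.items()}   # work on a copy
    paths = []
    for _ in range(k):
        path = dijkstra(g, src, dst)
        if path is None:
            break
        paths.append(path)
        for u, v in zip(path, path[1:]):               # remove used edges
            g[u].pop(v, None)
    return paths
```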
Abstract:
Reducing lot sizes is often proposed as a way to cut inventories. This suits an operating environment in which items can be purchased at the pace of one's own production and sales. In the fast-paced electronics business, where binding orders are placed with subcontractors and equipment suppliers on very short lead times, reducing purchasing lot sizes does not lighten the material flow; the slack must be taken out of the supply chain's lead times. In many supply chains, value-adding work takes only a fraction of the lead time, with the rest spent waiting in warehouses, in transport and in repeated inspections. Alongside cutting this wasted time, other cost savings are sought. Industry burdened by high labour costs has sought technical solutions that replace wage labour with various mechanical systems. The furthest along are machine shops doing small-batch and one-off manufacturing, which have invested in flexible manufacturing (FM) systems. This Master's thesis presents two intertwined research projects: the first focuses on developing unmanned operation, and the second lightens the material flow of the supply chain. The first project breaks new ground for the electronics industry, so a considerable amount of basic research was required, proceeding one milestone at a time. The project produced an automatic, flexible conveyor that can operate on its own, without labour, until its batteries need recharging. The other elements of the system containing the flexible conveyor were not raised to the same level of unmanned operation within the project; instead, it was decided to first gain experience with the conveyor's operation. The second project deepens cooperation with one of the company's subcontractors. The subcontractor is one of the few suppliers of components sourced from Finland, so it was considered important to compress the lead time closer to the length of the company's firm order book. The option examined was to move the subcontractor's production machine into the company's immediate vicinity, so that the company's production could order a batch of components directly from the subcontractor's machine, whereas initially the batch is ordered from the component warehouse. Only the raw material would be ordered on the basis of a forecast, while the value-adding work would be done strictly to order.
Abstract:
This thesis studies role-based access control and its suitability in the enterprise environment. The aim is to research how extensively role-based access control can be implemented in the case organization and how it supports the organization's business and IT functions. The study points out the enterprise's needs for access control, the factors of access control in the enterprise environment, the requirements for implementation, and the benefits and challenges it brings along. To determine how extensively role-based access control can be implemented in the case organization, the current state of access control is first examined. Secondly, a rudimentary desired state (how things should be) is defined, and thirdly it is completed using the results of the implementation of a role-based access control application. The study results in a role model for the case organization unit, together with the building blocks and the framework for an organization-wide implementation. The ultimate value for the organization is delivered by facilitating its normal operations while protecting its information assets.
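As a minimal illustration of the RBAC building blocks discussed above (users assigned to roles, roles granted permissions), here is a hypothetical Python sketch; the role and permission names are invented and the structure is a generic textbook RBAC core, not the thesis's role model.

```python
from dataclasses import dataclass, field
from typing import Dict, Set

# Generic RBAC core for illustration: access decisions go through roles,
# never directly from user to permission.

@dataclass
class RbacModel:
    role_permissions: Dict[str, Set[str]] = field(default_factory=dict)
    user_roles: Dict[str, Set[str]] = field(default_factory=dict)

    def grant(self, role: str, permission: str) -> None:
        self.role_permissions.setdefault(role, set()).add(permission)

    def assign(self, user: str, role: str) -> None:
        self.user_roles.setdefault(user, set()).add(role)

    def is_allowed(self, user: str, permission: str) -> bool:
        """A user holds a permission iff at least one of their roles grants it."""
        return any(permission in self.role_permissions.get(r, set())
                   for r in self.user_roles.get(user, ()))

# Hypothetical usage: granting and checking a permission via a role.
rbac = RbacModel()
rbac.grant("accounts-clerk", "invoice:read")
rbac.assign("alice", "accounts-clerk")
assert rbac.is_allowed("alice", "invoice:read")
assert not rbac.is_allowed("alice", "invoice:approve")
```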
Abstract:
Glucose is the primary source of energy for the brain but also an important source of building blocks for proteins, lipids, and nucleic acids. Little is known about the use of glucose for biosynthesis in tissues at the cellular level. We demonstrate that local cerebral metabolic activity can be mapped in mouse brain tissue by quantitatively imaging the biosynthetic products deriving from [U-(13)C]glucose metabolism using a combination of in situ electron microscopy and secondary ion mass spectrometry (NanoSIMS). Images of the (13)C label incorporated into cerebral ultrastructure with ca. 100 nm resolution allowed us to determine the timescale on which the metabolic products of glucose are incorporated into different cells, their sub-compartments and organelles. These were mapped in astrocytes and neurons in the different layers of the motor cortex. We see evidence for high metabolic activity in neurons via the (13)C enrichment of the nucleus. We observe that in all the major cell compartments, such as the nucleus and the Golgi apparatus, neurons incorporate substantially higher concentrations of (13)C label than astrocytes.