796 results for unconventional computing


Relevance: 20.00%

Publisher:

Abstract:

As manufacturing technologies advance, ever more transistors can be fitted onto integrated circuits. More complex circuits make it possible to perform more computations per unit of time. As circuit activity increases, so does energy consumption, which in turn increases the heat generated by the circuit. Excessive heat limits circuit operation, so techniques are needed to reduce the energy consumption of circuits. Small devices that monitor, for example, the human body, buildings or bridges have become a new research topic. Such devices must have low energy consumption so that they can operate for long periods without recharging their batteries. Near-Threshold Computing is a technique that aims to reduce the energy consumption of integrated circuits. The principle is to operate circuits at a lower supply voltage than the one the manufacturer originally designed them for. This slows down and impairs circuit operation, but if lower computing performance and reduced reliability can be tolerated in the device's operation, savings in energy consumption can be achieved. This Master's thesis examines Near-Threshold Computing from several perspectives: first on the basis of previous studies found in the literature, and then by investigating the application of Near-Threshold Computing in two case studies. The case studies examine an FO4 inverter and a 6T SRAM cell by means of circuit simulations. The behaviour of these components at Near-Threshold Computing voltages can be taken to give a comprehensive picture of a large share of the area and energy consumption of a typical IC. The case studies use a 130 nm technology and model real products of the manufacturing process by running multiple Monte Carlo simulations. This inexpensive-to-manufacture technology, combined with Near-Threshold Computing, makes it possible to produce low-power circuits at a reasonable cost. The results of this thesis show that Near-Threshold Computing reduces circuit energy consumption significantly. On the other hand, circuit speed decreases, and the commonly used 6T SRAM memory cell becomes unreliable. Longer paths in logic circuits and larger transistors in memory cells are shown to be effective countermeasures against the drawbacks of Near-Threshold Computing. The results provide guidance for low-power IC design on whether to use the nominal supply voltage or to lower it, in which case the slower and less reliable behaviour of the circuit must be taken into account.
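To make the voltage-scaling trade-off concrete, the sketch below uses a first-order model of how dynamic switching energy and gate delay change as the supply voltage approaches the threshold voltage. The model (E proportional to C·Vdd², alpha-power-law delay) and all parameter values are generic textbook assumptions, not results from the thesis.

```python
# Illustrative first-order model of near-threshold voltage scaling.
# E_dyn ~ C * Vdd^2 per switching event; delay ~ Vdd / (Vdd - Vth)^alpha
# (alpha-power law). Parameter values are assumptions for a generic
# 130 nm process, not measurements from the thesis.

C_EFF = 10e-15   # effective switched capacitance per gate (F), assumed
VTH   = 0.35     # threshold voltage (V), assumed
ALPHA = 1.3      # velocity-saturation exponent, assumed

def dynamic_energy(vdd: float) -> float:
    """Dynamic energy per switching event, in joules."""
    return C_EFF * vdd ** 2

def relative_delay(vdd: float, vdd_nominal: float = 1.2) -> float:
    """Gate delay relative to operation at the nominal supply voltage."""
    delay = lambda v: v / (v - VTH) ** ALPHA
    return delay(vdd) / delay(vdd_nominal)

for vdd in (1.2, 0.9, 0.6, 0.45):
    print(f"Vdd={vdd:.2f} V  E_dyn={dynamic_energy(vdd) * 1e15:.1f} fJ  "
          f"delay x{relative_delay(vdd):.1f}")
```

The output illustrates the thesis's central trade-off: quadratic energy savings as the supply voltage drops, paid for by a rapidly growing gate delay near the threshold.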

Relevance: 20.00%

Publisher:

Abstract:

In accordance with Moore's law, the increasing number of transistors integrated on a chip has enabled modern computing platforms with not only higher processing power but also more affordable prices. As a result, these platforms, including portable devices, workstations and data centres, are becoming an inevitable part of human society. However, with the demand for portability and the rising cost of power, energy efficiency has emerged as a major concern for modern computing platforms. As the complexity of on-chip systems increases, the Network-on-Chip (NoC) has proven to be an efficient communication architecture which can further improve system performance and scalability while reducing the design cost. Therefore, in this thesis, we study and propose energy optimization approaches based on the NoC architecture, with special focus on the following aspects. As the architectural trend for future computing platforms, 3D systems have many benefits, including higher integration density, a smaller footprint and heterogeneous integration. Moreover, 3D technology can significantly improve network communication and effectively avoid long wires, and therefore provide higher system performance and energy efficiency. Given the dynamic nature of on-chip communication in large-scale NoC-based systems, run-time system optimization is of crucial importance in order to achieve higher system reliability and, ultimately, energy efficiency. In this thesis, we propose an agent-based system design approach in which agents are on-chip components that monitor and control system parameters such as the supply voltage and operating frequency. With this approach, we have analysed implementation alternatives for dynamic voltage and frequency scaling and power gating techniques at different granularities, which reduce both dynamic and leakage energy consumption. Topologies, one of the key factors for NoCs, are also explored for energy-saving purposes. A Honeycomb NoC architecture is proposed in this thesis together with turn-model-based deadlock-free routing algorithms. Our analysis and simulation-based evaluation show that Honeycomb NoCs outperform their Mesh-based counterparts in terms of network cost, system performance and energy efficiency.
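A minimal sketch of the agent idea follows: an on-chip agent monitors the utilization of its NoC region and either selects a voltage/frequency operating point or power-gates the region when it is idle. The operating points, thresholds and hook names are hypothetical illustrations of agent-based DVFS and power gating, not the design proposed in the thesis.

```python
# Hypothetical per-region agent applying DVFS and power gating.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperatingPoint:
    voltage: float    # supply voltage (V)
    frequency: float  # clock frequency (MHz)

# Assumed voltage/frequency pairs; a real design would use platform tables.
OP_POINTS = [OperatingPoint(1.2, 1000),
             OperatingPoint(0.9, 600),
             OperatingPoint(0.7, 300)]

class RegionAgent:
    """Agent attached to one NoC region (e.g. a router and its local core)."""

    def __init__(self, idle_threshold: float = 0.05):
        self.idle_threshold = idle_threshold

    def decide(self, utilization: float) -> Optional[OperatingPoint]:
        """Pick an operating point; None means the region is power-gated."""
        if utilization < self.idle_threshold:
            return None                 # power gating removes leakage energy
        if utilization > 0.75:
            return OP_POINTS[0]         # heavy traffic: full voltage/frequency
        if utilization > 0.35:
            return OP_POINTS[1]         # moderate traffic: mid operating point
        return OP_POINTS[2]             # light traffic: lowest operating point

agent = RegionAgent()
for load in (0.02, 0.2, 0.5, 0.9):
    print(load, agent.decide(load))
```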

Relevance: 20.00%

Publisher:

Abstract:

This thesis discusses the opportunities and challenges of cloud computing technology in healthcare information systems by reviewing the existing literature on cloud computing and healthcare information systems and the impact of cloud computing technology on the healthcare industry. The review shows that if problems related to data security are solved, cloud computing will positively transform healthcare institutions by benefiting the healthcare IT infrastructure and improving healthcare services. This thesis therefore explores the opportunities and challenges associated with cloud computing in the context of Finland, in order to help healthcare organizations and stakeholders determine their direction when deciding to adopt cloud technology in their information systems.

Relevance: 20.00%

Publisher:

Abstract:

Video transcoding refers to the process of converting a digital video from one format into another. It is a compute-intensive operation; therefore, transcoding a large number of simultaneous video streams requires a large amount of computing resources. Moreover, to handle different load conditions in a cost-efficient manner, the video transcoding service should be dynamically scalable. Infrastructure as a Service (IaaS) clouds currently offer computing resources, such as virtual machines, under the pay-per-use business model. Thus, IaaS clouds can be leveraged to provide a cost-efficient, dynamically scalable video transcoding service. To use computing resources efficiently in a cloud computing environment, cost-efficient virtual machine provisioning is required to avoid over-utilization and under-utilization of virtual machines. This thesis presents proactive virtual machine resource allocation and de-allocation algorithms for video transcoding in the cloud. Since users' requests for videos may change over time, a check is required to determine whether the current computing resources are adequate for the video requests; therefore, work on admission control is also provided. In addition to admission control, temporal resolution reduction is used to avoid jitter in a video. Furthermore, in a cloud computing environment such as Amazon EC2, computing resources are more expensive than storage resources. Therefore, to avoid repeating transcoding operations, a transcoded video needs to be stored for a certain time. Storing all videos for the same amount of time is not cost-efficient either, because popular transcoded videos have a high access rate while unpopular transcoded videos are rarely accessed. This thesis provides a cost-efficient computation and storage trade-off strategy, which keeps videos in the video repository as long as it is cost-efficient to store them. The thesis also proposes video segmentation strategies for bit rate reduction and spatial resolution reduction transcoding. The proposed strategies are evaluated using a message passing interface (MPI) based video transcoder, which uses a coarse-grained parallel processing approach in which the video is segmented at the group-of-pictures level.
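The computation-storage trade-off can be sketched as a simple cost comparison: a transcoded video stays in the repository while its expected monthly storage cost is lower than the expected cost of re-transcoding it for future requests. The prices, the request-rate estimate and the function names below are illustrative assumptions, not the strategy's actual model.

```python
# Sketch of a storage-versus-recompute decision for transcoded videos.
# Prices and the estimation logic are assumptions for illustration only.

STORAGE_COST_PER_GB_MONTH = 0.03   # assumed storage price ($/GB/month)
COMPUTE_COST_PER_HOUR     = 0.10   # assumed VM price ($/hour)

def monthly_storage_cost(size_gb: float) -> float:
    return size_gb * STORAGE_COST_PER_GB_MONTH

def monthly_retranscode_cost(requests_per_month: float,
                             transcode_hours: float) -> float:
    # If the video is evicted, each future request pays for transcoding again.
    return requests_per_month * transcode_hours * COMPUTE_COST_PER_HOUR

def keep_in_repository(size_gb, requests_per_month, transcode_hours) -> bool:
    """True while storing the transcoded video is cheaper than redoing it."""
    return monthly_storage_cost(size_gb) < monthly_retranscode_cost(
        requests_per_month, transcode_hours)

# A popular 2 GB video requested 40 times a month is worth keeping,
# while one requested 0.1 times a month is cheaper to re-transcode.
print(keep_in_repository(2.0, 40, 0.25))   # True
print(keep_in_repository(2.0, 0.1, 0.25))  # False
```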

Relevance: 20.00%

Publisher:

Abstract:

Smartphones have become part and parcel of our lives, where mobility provides the freedom of not being bound by time and space. In addition, the number of smartphones produced each year is skyrocketing. However, this has also created discrepancies, or fragmentation, among devices and operating systems, which in turn has made it exceedingly hard for developers to deliver hundreds of similarly featured applications in various versions for market consumption. This thesis is an attempt to investigate whether cloud-based mobile development platforms can mitigate and eventually eliminate fragmentation challenges. During this research, we selected and analyzed the most popular cloud-based development platforms and tested their integrated cloud features. This research showed that cloud-based mobile development platforms may be able to reduce mobile fragmentation and make it possible to use a single codebase to deliver a mobile application for different platforms.

Relevance: 20.00%

Publisher:

Abstract:

Neurotransmitters are also involved in functions other than conventional signal transfer between nerve cells, such as development, plasticity, neurodegeneration, and neuroprotection. For example, there is a considerable amount of data indicating developmental roles for the glutamatergic, cholinergic, dopaminergic, GABA-ergic, and ATP/adenosine systems. In this review, we discuss the existing literature on these "new" functions of neurotransmitters in relation to some unconventional neurotransmitters, such as the endocannabinoids and nitric oxide. Data indicating both transcriptional and post-transcriptional modulation of the endocannabinoid and nitrinergic systems after neural lesions are discussed in relation to the non-conventional roles of these neurotransmitters. Knowledge of the roles of neurotransmitters in brain functions other than information transfer is critical for a more complete understanding of the functional organization of the brain and provides more opportunities for the development of therapeutic tools aimed at minimizing neuronal death.

Relevance: 20.00%

Publisher:

Abstract:

Smart home implementation in residential buildings promises to optimize energy usage and save a significant amount of energy simply through a better understanding of the user's energy usage profile. Apart from its energy optimization prospects, the technology also aims to give occupants a significant amount of comfort and remote control over home appliances, both at home and from remote locations. However, a smart home investment, like any other investment, requires an adequate measurement and justification of the economic gains it could offer before its realization. These economic gains can differ between occupants due to their inherent behaviours and tendencies. It is therefore pertinent to investigate the various behaviours and tendencies of occupants in different domains of interest and to measure the value of the energy savings accrued by smart home implementations in these domains in order to justify such economic gains. This thesis investigates two domains of interest (the rented apartment and the owned apartment) for two sets of behavioural tendencies (Finland and Germany), obtained from observation and corroborated by interviews, in order to measure the payback time and return on investment (ROI) of their smart home implementations. Similar measures are also obtained for an identified Australian use case. The research findings reveal that building automation for the Finnish behavioural tendencies seems to offer a better ROI and payback time for smart home implementations.
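The two economic measures used in the thesis, payback time and return on investment, can be sketched as below; the investment cost, annual savings and evaluation horizon are made-up example numbers, not figures from the Finnish, German or Australian cases.

```python
# Payback time and ROI for a smart home investment; all inputs are
# illustrative assumptions, not results from the study.

def payback_time_years(investment: float, annual_savings: float) -> float:
    """Years until cumulative savings cover the initial investment."""
    return investment / annual_savings

def roi(investment: float, annual_savings: float, horizon_years: float) -> float:
    """Return on investment over the horizon, as a fraction of the cost."""
    total_gain = annual_savings * horizon_years
    return (total_gain - investment) / investment

investment = 3000.0      # assumed smart home installation cost (EUR)
annual_savings = 450.0   # assumed yearly energy-cost savings (EUR)

print(f"payback: {payback_time_years(investment, annual_savings):.1f} years")
print(f"ROI over 10 years: {roi(investment, annual_savings, 10):.0%}")
```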

Relevance: 20.00%

Publisher:

Abstract:

Power consumption is still an issue in wearable computing applications today. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, so that energy-efficient wireless sensors for context recognition in wearable computing applications can be designed in the future. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios: Display, Speaker, Camera and Microphone, Transfer by Wi-Fi, Outdoor Physical Activity Monitoring, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses and the SimValley Smartwatch AW-420.RX are the three devices representative of their form factors. Power consumption is measured using PowerTutor, an Android energy-profiler application with a logging option; since it relies on unknown device parameters, it is calibrated against a USB power meter. The results show that screen size is the main parameter influencing power consumption. The power consumption for an identical scenario varies across the wearable devices, meaning that other components, parameters or processes may affect power consumption, and further study is needed to explain these variations. This paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (a speaker is more efficient than a display) affect energy consumption in different ways. The paper gives recommendations for reducing energy consumption in healthcare wearable computing applications using the energy model.
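A scenario-based energy model of the kind mentioned above can be sketched as a sum of power draw multiplied by active time over the components used in each scenario. The component names and power figures below are illustrative assumptions, not the PowerTutor measurements reported in the paper.

```python
# Sketch of a scenario-based energy model: scenario energy is the sum of
# each active component's power draw times its active time. All numbers
# are assumed for illustration, not measured values from the paper.

SCENARIOS = {
    # scenario: list of (component, power_mW, active_seconds)
    "wifi_transfer": [("cpu", 300, 60), ("wifi", 700, 60), ("screen", 400, 10)],
    "pedometer":     [("cpu", 50, 3600), ("accelerometer", 2, 3600)],
}

def scenario_energy_joules(steps) -> float:
    """Total energy = sum of power (W) * active time (s) over components."""
    return sum(power_mw / 1000.0 * seconds for _, power_mw, seconds in steps)

for name, steps in SCENARIOS.items():
    print(f"{name}: {scenario_energy_joules(steps):.1f} J")
```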

Relevance: 20.00%

Publisher:

Abstract:

The manufacturing industry has always faced the challenge of improving production efficiency, product quality and innovation ability, and has struggled to adopt cost-effective manufacturing systems. In recent years, cloud computing has emerged as one of the major enablers for the manufacturing industry. By combining cloud computing and other advanced manufacturing technologies, such as the Internet of Things, service-oriented architecture (SOA), networked manufacturing (NM) and the manufacturing grid (MGrid), with existing manufacturing models and enterprise information technologies, a new paradigm called cloud manufacturing has been proposed in the recent literature. This study presents the concepts and ideas of cloud computing and cloud manufacturing. The concept, architecture, core enabling technologies and typical characteristics of cloud manufacturing are discussed, as well as the differences and relationship between cloud computing and cloud manufacturing. The research is based on mixed qualitative and quantitative methods and a case study. The case is a prototype cloud manufacturing solution, a software platform developed in cooperation between ATR Soft Oy and the SW Company China office. The study tries to understand the practical impacts and challenges that derive from cloud manufacturing. Its main conclusion is that cloud manufacturing is an approach to achieving the transformation from traditional production-oriented manufacturing to next-generation service-oriented manufacturing. Many manufacturing enterprises are already using a form of cloud computing in their existing network infrastructure to increase the flexibility of their supply chains and reduce resource consumption, and the study finds that the shift from cloud computing to cloud manufacturing is feasible. Meanwhile, the study points out that the related theory, methodology and applications of cloud manufacturing systems are far from mature; it is still an open field in which many new technologies need to be studied.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this study was to develop fettuccini-type fresh rice pasta by cold extrusion. To produce the pasta, a 2² central composite rotational design was used, in which the effects of the addition of pre-gelatinized rice flour (PGRF, 0-60%) and modified egg albumin (MEA, 0-10%) were studied. The dependent variables were the results of the cooking test and of the instrumental texture analysis. The optimum cooking time for all formulations of fresh rice pasta was 3 minutes. MEA had a greater effect on increasing the weight of the pasta than PGRF. It was found that the addition of PGRF increased the loss of solids in the cooking water, whereas MEA had the opposite effect on this parameter. Moreover, the maximum level of MEA (10%) had an optimum effect on pasta firmness, while PGRF had a negative effect on this parameter. The maximum levels of PGRF and MEA reduced the stickiness of the pasta. Based on these results and on the parameters considered most important, the rice pasta with the best technological characteristics was the one with the maximum level of MEA (10%) and no addition of PGRF (0%). This product was submitted to sensory and microbiological analyses, with good results.
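For readers unfamiliar with the experimental design, the sketch below generates the coded points of a 2² central composite rotational design (factorial, axial and centre points) and maps them to ingredient levels; treating the stated 0-60% PGRF and 0-10% MEA ranges as the axial extremes is an assumption, not a detail taken from the study.

```python
# Coded points of a 2^2 central composite rotational design: four factorial
# points, four axial points at alpha = sqrt(2), and a centre point. Mapping
# the coded range [-alpha, +alpha] onto 0-60% PGRF and 0-10% MEA is assumed.

from math import sqrt

ALPHA = sqrt(2)  # rotatability condition for two factors

coded_points = (
    [(-1, -1), (-1, 1), (1, -1), (1, 1)]                  # factorial points
    + [(-ALPHA, 0), (ALPHA, 0), (0, -ALPHA), (0, ALPHA)]  # axial points
    + [(0, 0)]                                            # centre point
)

def decode(coded: float, low: float, high: float) -> float:
    """Map a coded level in [-alpha, +alpha] to the real range [low, high]."""
    centre, half_range = (high + low) / 2, (high - low) / 2
    return centre + coded * half_range / ALPHA

for x1, x2 in coded_points:
    pgrf = decode(x1, 0, 60)   # pre-gelatinized rice flour (%)
    mea = decode(x2, 0, 10)    # modified egg albumin (%)
    print(f"PGRF={pgrf:5.1f}%  MEA={mea:5.2f}%")
```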

Relevance: 20.00%

Publisher:

Abstract:

Integrins are cell surface adhesion and signaling receptors. Cells use integrins to attach to the extracellular matrix and to other cells, as well as for sensing their environment. In addition to adhesion and migration, integrins have been shown to be important for many biological processes, including apoptosis, cell proliferation, and differentiation into specific tissues. Many important next-generation biological drugs inhibit integrin functions. Thus, research into interactions between integrins and their ligands under different physiological and pathological conditions is not only of academic interest but is also important for the field of drug discovery. In this Ph.D. project, integrin-ligand interactions were studied under different physiologically interesting conditions, including 1) human echovirus 1 binding to integrin α2β1, 2) integrin α2β1 binding to collagen under flow conditions, 3) integrin α2β1 binding to a ligand in the presence of the angiogenesis inhibitor histidine-rich glycoprotein (HRG), and 4) integrin binding to post-translationally citrullinated ligands. As a result of the project, we could show that under each condition the integrin-ligand interaction is somewhat unconventional. 1) Echovirus 1 binds only to non-activated conformations of integrin α2β1. 2) Surprisingly, the non-activated conformation is also the primary conformation of integrin α2β1 when it binds to collagen under flow conditions, as when platelets adhere to subendothelial collagen in vascular injuries. In addition, pre-activation of integrin α2β1 does not increase adhesion under flow. 3) HRG binds to integrin α2β1 through a low-affinity interaction that inhibits integrin binding to collagen. This shows that low-affinity interactions can be biologically relevant and possibly regulate angiogenesis. 4) The citrullination of collagen, a post-translational modification reported to occur in rheumatoid arthritis, specifically inhibits the binding of integrins α10β1 and α11β1, but does not affect the binding of α1β1 and α2β1. On the other hand, the citrullination of the isoDGR motif in fibronectin and the RGD motif in pro-TGF-β inhibits integrin binding completely. Citrullination seems to be an inflammation-related process, and integrin ligands are frequently citrullinated in vivo. This Ph.D. thesis suggests that unconventional interaction mechanisms between integrins and their ligands, such as post-translational modifications, low-affinity interactions, and non-activated integrin conformations, can have an important role in pathological processes. The study of these kinds of integrin-ligand interactions is important for understanding biological phenomena more deeply. The research might also be beneficial for the development of integrin-based therapies for treating diseases.

Relevance: 20.00%

Publisher:

Abstract:

Variations in different types of genomes have been found to be responsible for a large degree of physical diversity, such as appearance and susceptibility to disease. Identification of genomic variations is difficult and can be facilitated through computational analysis of DNA sequences. Newly available technologies are able to sequence billions of DNA base pairs relatively quickly. These sequences can be used to identify variations within a specific genome, but must first be mapped to a reference sequence. In order to align these sequences to a reference sequence, we require mapping algorithms that make use of approximate string matching and string indexing methods. To date, few mapping algorithms have been tailored to handle the massive amounts of output generated by newly available sequencing technologies. In order to handle this large amount of data, we modified the popular mapping software BWA to run in parallel using OpenMPI. Parallel BWA matches the efficiency of multithreaded BWA functions while providing efficient parallelism for BWA functions that do not currently support multithreading. Parallel BWA shows significant wall-time speedup in comparison with multithreaded BWA on high-performance computing clusters, and will thus facilitate the analysis of genome sequencing data.
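The data decomposition behind running BWA in parallel can be illustrated as follows: the read set is split into one chunk per MPI rank, and each rank aligns its chunk independently against the reference. The actual Parallel BWA modifies BWA's C sources with OpenMPI; this mpi4py/subprocess sketch and its file names are only an illustration of the idea.

```python
# Illustration of the data-decomposition idea behind MPI-parallel read
# mapping: rank 0 lists per-rank FASTQ chunks, the chunks are scattered,
# and every rank runs `bwa aln` on its own chunk. The real Parallel BWA
# works inside BWA's C code; file names here are hypothetical.

import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

REFERENCE = "reference.fa"  # assumed to be pre-indexed with `bwa index`

if rank == 0:
    # One pre-split FASTQ chunk per rank (the splitting step is omitted here).
    chunks = [f"reads.chunk{i}.fastq" for i in range(size)]
else:
    chunks = None

my_chunk = comm.scatter(chunks, root=0)

# Each rank aligns its own chunk, writing a per-rank .sai file.
with open(f"aln.rank{rank}.sai", "wb") as out:
    subprocess.run(["bwa", "aln", REFERENCE, my_chunk], stdout=out, check=True)

comm.Barrier()
if rank == 0:
    print("all ranks finished; per-chunk alignments can now be merged")
```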

Relevance: 20.00%

Publisher:

Abstract:

In this work, we consider the properties of planar topological defects in unconventional superconductors. Specifically, we calculate microscopically the interaction energy of domain walls separating degenerate ground states in a chiral p-wave fermionic superfluid. The interaction is mediated by the quasiparticles experiencing Andreev scattering at the domain walls. As a by-product, we derive a useful general expression for the free energy of an arbitrary nonuniform texture of the order parameter in terms of the quasiparticle scattering matrix. The thesis is structured as follows. We begin with a historical review of the theories of superconductivity (Sec. 1.1), which led the way to the celebrated Bardeen-Cooper-Schrieffer (BCS) theory (Sec. 1.3). We then proceed to the treatment of superconductors with so-called "unconventional pairing" in Sec. 1.4, and in Sec. 1.5 we introduce the specific case of chiral p-wave superconductivity. After introducing in Sec. 2 the domain wall (DW) model that is considered throughout the work, we derive the Bogoliubov-de Gennes (BdG) equations in Sec. 3.1, which determine the quasiparticle excitation spectrum for a nonuniform superconductor. In this work, we use the semiclassical (Andreev) approximation and solve the Andreev equations (a particular case of the BdG equations) in Sec. 4 to determine the quasiparticle spectrum for both the single- and two-DW textures. The Andreev equations are derived in Sec. 3.2, and the formal properties of the Andreev scattering coefficients are discussed in the following subsection. In Sec. 5, we use the transfer matrix method to relate the interaction energy of the DWs to the scattering matrix of the Bogoliubov quasiparticles. This facilitates the derivation of an analytical expression for the interaction energy between the two DWs in Sec. 5.3. Finally, to illustrate the general applicability of our method, we apply it in Sec. 6 to the interaction between phase solitons in a two-band s-wave superconductor.
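For reference, the Bogoliubov-de Gennes equations mentioned above have the standard form below (common textbook notation, which need not match the thesis exactly); the Andreev approximation then factors out the fast oscillation at the Fermi wavevector and keeps only slowly varying envelope amplitudes along classical trajectories.

```latex
% Standard form of the Bogoliubov-de Gennes equations for the quasiparticle
% amplitudes (u_n, v_n); textbook notation, not necessarily that of the thesis.
\begin{equation}
  \begin{pmatrix}
    \hat{H}_0(\mathbf{r}) & \hat{\Delta}(\mathbf{r}) \\
    \hat{\Delta}^{\dagger}(\mathbf{r}) & -\hat{H}_0^{*}(\mathbf{r})
  \end{pmatrix}
  \begin{pmatrix} u_n(\mathbf{r}) \\ v_n(\mathbf{r}) \end{pmatrix}
  = E_n
  \begin{pmatrix} u_n(\mathbf{r}) \\ v_n(\mathbf{r}) \end{pmatrix},
  \qquad
  \hat{H}_0 = -\frac{\hbar^2 \nabla^2}{2m} - \mu .
\end{equation}
```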

Relevance: 20.00%

Publisher:

Abstract:

Microarray data analysis is a data mining tool used to extract meaningful information hidden in biological data. One of the major focuses of microarray data analysis is the reconstruction of gene regulatory networks, which may be used to provide a broader understanding of the functioning of complex cellular systems. Since cancer is a genetic disease arising from abnormal gene function, the identification of cancerous genes and the regulatory pathways they control will provide a better platform for understanding tumor formation and development. The major focus of this thesis is to understand the regulation of genes responsible for the development of cancer, particularly colorectal cancer, by analyzing microarray expression data. In this thesis, four computational algorithms, namely a fuzzy logic algorithm, a modified genetic algorithm, a dynamic neural fuzzy network, and a Takagi-Sugeno-Kang-type recurrent neural fuzzy network, are used to extract a cancer-specific gene regulatory network from a plasma RNA dataset of colorectal cancer patients. Plasma RNA is highly attractive for cancer analysis since it requires only a small amount of blood and can be collected at any time in a repetitive fashion, allowing the analysis of disease progression and treatment response.