Abstract:
Information processing in the human brain has always been considered a source of inspiration in Artificial Intelligence; in particular, it has led researchers to develop different tools such as artificial neural networks. Recent findings in neurophysiology provide evidence that not only neurons but also astrocytes, both isolated and in networks, are responsible for processing information in the human brain. Artificial neural networks (ANNs) model neuron-neuron communications. Artificial neuron-glia networks (ANGNs) model, in addition to neuron-neuron communications, neuron-astrocyte connections. In continuation of the research on ANGNs, we first propose and evaluate a model of adaptive neuro-fuzzy inference systems augmented with artificial astrocytes. Then, we propose a model of ANGNs that captures the communications of astrocytes in the brain; in this model, a network of artificial astrocytes is implemented on top of a typical neural network. The results of the implementation of both networks show that, for certain combinations of parameter values specifying astrocytes and their connections, the new networks outperform typical neural networks. This research opens a range of possibilities for future work on designing more powerful architectures of artificial neural networks that are based on more realistic models of the human brain.
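To make the neuron-astrocyte coupling concrete, below is a minimal sketch of a neuron-glia layer in Python, assuming a simple astrocyte rule in which sustained firing of a neuron transiently boosts its connection weights. The window, boost, and decay parameters are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

# Illustrative sketch of a neuron-glia layer: each output neuron has an
# associated artificial astrocyte that counts consecutive firings and
# transiently scales that neuron's incoming weights. All parameters below
# are illustrative assumptions.
class NeuronGliaLayer:
    def __init__(self, n_in, n_out, window=4, boost=1.25, decay=0.8):
        rng = np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.5, size=(n_in, n_out))
        self.counters = np.zeros(n_out)      # astrocyte activity counters
        self.modulation = np.ones(n_out)     # per-neuron weight scaling
        self.window, self.boost, self.decay = window, boost, decay

    def forward(self, x):
        z = x @ (self.W * self.modulation)   # astrocyte-modulated weights
        y = 1.0 / (1.0 + np.exp(-z))         # sigmoid activation
        fired = y > 0.5
        # Astrocyte dynamics: sustained firing boosts the connections,
        # inactivity lets the modulation decay back toward 1.
        self.counters = np.where(fired, self.counters + 1, 0)
        active = self.counters >= self.window
        self.modulation = np.where(active, self.modulation * self.boost,
                                   1.0 + (self.modulation - 1.0) * self.decay)
        return y

layer = NeuronGliaLayer(n_in=3, n_out=2)
for _ in range(6):
    out = layer.forward(np.array([0.9, -0.2, 0.4]))
print(out, layer.modulation)
```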
Abstract:
While molecular and cellular processes are often modeled as stochastic processes, such as Brownian motion, chemical reaction networks, and gene regulatory networks, there are few attempts to program a molecular-scale process to physically implement stochastic processes. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents, and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications remains unknown.
In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
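The CTMC mapping can be illustrated in simulation: the sketch below draws samples of the absorption time of a small CTMC, which is exactly a phase-type distributed random variable, with absorption playing the role of photon emission. The rate matrix is a toy assumption standing in for a real RET network geometry.

```python
import numpy as np

# Sketch: sample absorption times of a small CTMC, i.e., the phase-type
# distribution described in the abstract. States 0-2 are transient (exciton
# on a chromophore); state 3 is absorbing (photon emitted). Q is assumed.
Q = np.array([[-3.0,  2.0,  0.5,  0.5],
              [ 1.0, -2.5,  1.0,  0.5],
              [ 0.2,  0.3, -1.5,  1.0],
              [ 0.0,  0.0,  0.0,  0.0]])   # absorbing state

def sample_absorption_time(Q, start=0, rng=np.random.default_rng()):
    state, t = start, 0.0
    while Q[state, state] != 0.0:           # run until absorption
        rate = -Q[state, state]
        t += rng.exponential(1.0 / rate)    # exponential holding time
        probs = Q[state].clip(min=0.0) / rate
        state = rng.choice(len(Q), p=probs) # jump to the next state
    return t

samples = [sample_absorption_time(Q) for _ in range(10000)]
print(np.mean(samples))   # empirical mean of the phase-type law
```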
By using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and the Maximum Likelihood Estimation (MLE) based taggant identification guarantees high accuracy even with only a few hundred detected photons.
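The MLE identification step admits a compact sketch: given detected photon arrival times, evaluate each candidate taggant's phase-type log-likelihood and pick the maximizer. The sub-generators and arrival times below are hypothetical examples, not measured RET designs.

```python
import numpy as np
from scipy.linalg import expm

# Sketch of MLE taggant identification: each candidate taggant is modeled
# by a phase-type density f(t) = alpha @ expm(T t) @ t0, where T is the
# sub-generator over transient states and t0 = -T @ 1. Both candidate
# sub-generators below are toy assumptions.
candidates = {
    "taggant_A": np.array([[-3.0, 2.0], [1.0, -2.5]]),
    "taggant_B": np.array([[-1.2, 0.4], [0.3, -0.9]]),
}
alpha = np.array([1.0, 0.0])                 # exciton starts on chromophore 0

def log_likelihood(T, times):
    t0 = -T @ np.ones(len(T))                # exit (emission) rates
    return sum(np.log(alpha @ expm(T * t) @ t0) for t in times)

photon_times = np.array([0.21, 0.85, 0.40, 1.10, 0.33])  # hypothetical detections
best = max(candidates, key=lambda k: log_likelihood(candidates[k], photon_times))
print(best)
```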
Meanwhile, RET-based sampling units (RSUs) can be constructed to accelerate probabilistic algorithms with wide applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware of traditional computers, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor/GPU as a specialized functional unit or organized as a discrete accelerator to bring substantial speedups and power savings.
Abstract:
This work explores the idea of constitutional justice in Africa with a focus on constitutional interpretation in Ghana and Nigeria. The objective is to develop a theory of constitutional interpretation based upon a conception of law that allows the existing constitutions of Ghana and Nigeria to be construed by the courts as law in a manner that best serves the collective wellbeing of the people. The project involves an examination of both legal theory and substantive constitutional law. The theoretical argument will be applied to show how a proper understanding of the ideals of the rule of law and constitutionalism in Ghana and Nigeria necessitates the conclusion that socio-economic rights in those countries are constitutionally protected and judicially enforceable. The thesis argues that this conclusion follows from a general claim that constitutions should represent a ‘fundamental law’ and must be construed as an aspirational moral ideal for the common good of the people. The argument is essentially about the inherent character of ‘legality’ or the ‘rule of law.’ It weaves together ideas developed by Lon Fuller, Ronald Dworkin, T.R.S. Allan and David Dyzenhaus, as well as the strand of common law constitutionalism associated with Sir Edward Coke, to develop a moral sense of ‘law’ that transcends the confines of positive or explicit law while remaining inherently ‘legal’ as opposed to purely moral or political. What emerges is an unwritten fundamental law of reason located between pure morality or natural law on the one hand and strict, explicit, or positive law on the other. It is argued that this fundamental law is, or should be, the basis of constitutional interpretation, especially in transitional democracies like Ghana and Nigeria, and that it grounds constitutional protection for socio-economic rights. Equipped with this theory of law, courts in developing African countries like Ghana and Nigeria will be in a better position to contribute towards developing a real sense of constitutional justice for Africa.
Abstract:
The authors explored whether a testing effect occurs not only for retention of facts but also for application of principles and procedures. For that purpose, 38 high school students either repeatedly studied a text on probability calculations or studied the text, took a test on the content, restudied the text, and finally took the test a second time. Results show that testing not only leads to better retention of facts than restudying, but also to better application of acquired knowledge (i.e., principles and procedures) in high school statistics. In other words, testing seems not only to benefit fact retention, but also positively affects deeper learning.
Abstract:
Laser-plasma-based accelerators of protons and heavier ions are a source of potential interest for several applications, including in the biomedical area. While the potential future use in cancer hadrontherapy acts as a strong aspirational motivation for this research field, radiobiology employing laser-driven ion bursts is already an active field of research. Here we give a summary of the state of the art in laser-driven ion acceleration, of the main challenges currently faced by research in this field, and of some of the current and future strategies for overcoming them.
Abstract:
In this paper, we consider the transmission of confidential information over a κ-μ fading channel in the presence of an eavesdropper who also experiences κ-μ fading. In particular, we obtain novel analytical solutions for the probability of strictly positive secrecy capacity (SPSC) and a lower bound of secure outage probability (SOPL) for independent and non-identically distributed channel coefficients without parameter constraints. We also provide a closed-form expression for the probability of SPSC when the μ parameter is assumed to take positive integer values. Monte Carlo simulations are performed to verify the derived results. The versatility of the κ-μ fading model means that the results presented in this paper can be used to determine the probability of SPSC and SOPL for a large number of other fading scenarios, such as Rayleigh, Rice (Nakagami-n), Nakagami-m, One-Sided Gaussian, and mixtures of these common fading models. In addition, due to the duality of the analysis of secrecy capacity and co-channel interference (CCI), the results presented here will have immediate applicability in the analysis of outage probability in wireless systems affected by CCI and background noise (BN). To demonstrate the efficacy of the novel formulations proposed here, we use the derived equations to provide a useful insight into the probability of SPSC and SOPL for a range of emerging wireless applications, such as cellular device-to-device, peer-to-peer, vehicle-to-vehicle, and body-centric communications using data obtained from real channel measurements.
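As a numerical companion to the closed-form results, the probability of SPSC, Pr{γ_M > γ_E}, can be estimated by Monte Carlo using the physical model of κ-μ fading (μ clusters of scattered Gaussian components around dominant components). All parameter values in this sketch are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: Monte Carlo estimate of the probability of strictly positive
# secrecy capacity, Pr{gamma_M > gamma_E}, for independent kappa-mu fading
# on the main (M) and eavesdropper (E) channels.
def kappa_mu_power(kappa, mu, mean_power, n, rng):
    # Physical model: mu clusters of scattered Gaussian components around
    # dominant components; kappa is the dominant-to-scattered power ratio.
    sigma2 = mean_power / (2.0 * mu * (1.0 + kappa))
    d2 = 2.0 * mu * sigma2 * kappa            # total dominant power
    p = np.sqrt(d2 / (2.0 * mu))              # dominant power split per cluster
    X = rng.normal(p, np.sqrt(sigma2), size=(n, mu))
    Y = rng.normal(p, np.sqrt(sigma2), size=(n, mu))
    return (X**2 + Y**2).sum(axis=1)

n = 200_000
gamma_M = kappa_mu_power(kappa=2.0, mu=2, mean_power=10.0, n=n, rng=rng)
gamma_E = kappa_mu_power(kappa=1.0, mu=1, mean_power=3.0, n=n, rng=rng)
print("Pr(SPSC) ~", np.mean(gamma_M > gamma_E))
```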
Abstract:
Human societies are reliant on the functioning of the hydrologic cycle. The atmospheric branch of this cycle, often referred to as moisture recycling in the context of land-to-land exchange, refers to water evaporating, traveling through the atmosphere, and falling out as precipitation. Similar to the surface water cycle that uses the watershed as the unit of analysis, it is also possible to consider a ‘watershed of the sky’ for the atmospheric water cycle. Thus, I explore the precipitationshed - defined as the upwind surface of the Earth that provides evaporation that later falls as precipitation in a specific place. The primary contributions of this dissertation are to (a) introduce the precipitationshed concept, (b) provide a quantitative basis for the study of the precipitationshed, and (c) demonstrate its use in the fields of hydrometeorology, land-use change, social-ecological systems, ecosystem services, and environmental governance. In Paper I, the concept of the precipitationshed is introduced and explored for the first time. The quantification of precipitationshed variability is described in Paper II, and the key finding is that the precipitationsheds for multiple regions are persistent in time and space. Moisture recycling is further described as an ecosystem service in Paper III, to integrate the concept into the existing language of environmental sustainability and management. That is, I identify regions where vegetation more strongly regulates the provision of atmospheric water, as well as the regions that more strongly benefit from this regulation. In Paper IV, the precipitationshed is further explored through the lens of urban reliance on moisture recycling. Using a novel method, I quantify the vulnerability of urban areas to social-ecological changes within their precipitationsheds. In Paper V, I argue that successful moisture recycling governance will require flexible, transboundary institutions that are capable of operating within complex social-ecological systems. I conclude that, in the future, the precipitationshed can be a key tool in addressing the complexity of social-ecological systems.
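As a rough illustration of the quantitative basis developed in Paper II, a precipitationshed can be delineated from a moisture-tracking contribution field by keeping the upwind cells that together supply a chosen fraction of the sink region's precipitation. The toy field and 70% threshold below are assumptions, not results from the papers.

```python
import numpy as np

# Sketch: delineating a precipitationshed from a moisture-tracking output.
# `contrib` holds, per upwind grid cell, the evaporation that later falls
# as precipitation over the sink region; both the field and the threshold
# are illustrative assumptions.
rng = np.random.default_rng(2)
contrib = rng.exponential(1.0, size=(60, 90))     # toy contribution field

def precipitationshed(contrib, fraction=0.70):
    flat = np.sort(contrib.ravel())[::-1]         # strongest sources first
    cutoff = flat[np.searchsorted(np.cumsum(flat),
                                  fraction * flat.sum())]
    return contrib >= cutoff                      # boolean shed mask

mask = precipitationshed(contrib)
print(mask.sum(), "cells supply 70% of the sink's precipitation")
```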
Abstract:
Educational systems worldwide are facing an enormous shift as a result of sociocultural, political, economic, and technological changes. The technologies and practices that have developed over the last decade have been heralded as opportunities to transform both online and traditional education systems. While proponents of these new ideas often postulate that they have the potential to address the educational problems facing both students and institutions and that they could provide an opportunity to rethink the ways that education is organized and enacted, there is little evidence of emerging technologies and practices in use in online education. Because researchers and practitioners interested in these possibilities often reside in various disciplines and academic departments, the sharing and dissemination of their work across often rigid boundaries is a formidable task. Contributors to Emergence and Innovation in Digital Learning include individuals who are shaping the future of online learning with their innovative applications and investigations on the impact of issues such as openness, analytics, MOOCs, and social media. Building on work first published in Emerging Technologies in Distance Education, the contributors to this collection harness the dispersed knowledge in online education to provide a one-stop locale for work on emergent approaches in the field. Their conclusions will influence the adoption and success of these approaches to education and will enable researchers and practitioners to conceptualize, critique, and enhance their understanding of the foundations and applications of new technologies.
Abstract:
A collection of lecture notes for a short course prepared by the Fire Safety Engineering Group - University of Greenwich.
Abstract:
The main results of this paper are twofold: the first is a matrix-theoretic result. We say that a matrix is superregular if all of its minors that are not trivially zero are nonzero. Given an a×b, a ≥ b, superregular matrix over a field, we show that if all of its rows are nonzero then any linear combination of its columns with nonzero coefficients has at least a−b+1 nonzero entries. Secondly, we make use of this result to construct convolutional codes that attain the maximum possible distance for some fixed parameters of the code, namely the rate and the Forney indices. These results answer some open questions on distances and constructions of convolutional codes posed in the literature.
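The weight bound can be checked numerically on a small example. The sketch below uses a Cauchy matrix, all of whose minors are nonzero, as a stand-in superregular matrix (the paper's notion additionally discounts trivially zero minors, which this full-matrix toy never has), and verifies that a column combination with nonzero coefficients has at least a−b+1 nonzero entries.

```python
import numpy as np
from itertools import combinations

# Sketch checking the weight bound on a toy example. A Cauchy matrix
# A[i,j] = 1/(x_i + y_j) with distinct nodes has every minor nonzero, so it
# is superregular in the strong sense.
a, b = 6, 3
x = np.arange(1, a + 1, dtype=float)
y = np.arange(0.5, b + 0.5)
A = 1.0 / (x[:, None] + y[None, :])

# Confirm superregularity: all minors of all orders are nonzero.
for k in range(1, b + 1):
    for rows in combinations(range(a), k):
        for cols in combinations(range(b), k):
            assert abs(np.linalg.det(A[np.ix_(rows, cols)])) > 1e-12

# The result: any combination of the columns with nonzero coefficients
# has at least a - b + 1 nonzero entries.
rng = np.random.default_rng(3)
c = rng.uniform(0.5, 2.0, size=b) * rng.choice([-1.0, 1.0], size=b)
v = A @ c
print(np.count_nonzero(np.abs(v) > 1e-12), ">=", a - b + 1)
```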
Abstract:
The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and gather all information that would be relevant for their research. As a response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale, to scan the whole publicly available biomedical literature and extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among different tasks, biomedical event extraction has received much attention within the BioNLP community recently. Biomedical event extraction constitutes the identification of biological processes and interactions described in biomedical literature, and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction have given rise to a number of event extraction systems, several of which have been applied at a large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on a machine-learning approach and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by the end-users. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes/gene-products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine-learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
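The edge-prediction casting can be sketched briefly: featurize node pairs of an event network and train a classifier on known edges versus non-edges. The toy graph and two features below are illustrative; the thesis uses a richer feature set extracted from large-scale event networks.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression

# Sketch of hypothesis generation as supervised edge prediction: score
# unseen gene/gene-product pairs from simple graph features. The toy graph
# and features (common neighbors, degree product) are assumptions.
edges = {("geneA", "geneB"), ("geneB", "geneC"), ("geneA", "geneC"),
         ("geneC", "geneD"), ("geneD", "geneE")}
nodes = sorted({n for e in edges for n in e})
nbrs = {n: {v for e in edges for v in e if n in e and v != n} for n in nodes}

def features(u, v):
    return [len(nbrs[u] & nbrs[v]), len(nbrs[u]) * len(nbrs[v])]

pairs = list(combinations(nodes, 2))
X = np.array([features(u, v) for u, v in pairs])
y = np.array([1 if (u, v) in edges or (v, u) in edges else 0 for u, v in pairs])

clf = LogisticRegression().fit(X, y)
for i, ((u, v), p) in enumerate(zip(pairs, clf.predict_proba(X)[:, 1])):
    if y[i] == 0:                 # score candidate new edges only
        print(u, v, round(p, 2))
```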
Abstract:
The main drivers for the development and evolution of Cyber Physical Systems (CPS) are the reduction of development costs and time, along with the enhancement of the designed products. The aim of this survey paper is to provide an overview of different types of systems and the associated transition process from mechatronics to CPS and cloud-based (IoT) systems. It further considers the requirement that methodologies for CPS design should be part of a multi-disciplinary development process within which designers should focus not only on the separate physical and computational components, but also on their integration and interaction. Challenges related to CPS design are therefore considered in the paper from the perspectives of the physical processes, computation, and integration respectively. Illustrative case studies are selected from different system levels, starting with the description of the overarching concept of Cyber Physical Production Systems (CPPSs). The analysis and evaluation of the specific properties of a sub-system using a condition monitoring system, important for maintenance purposes, is then given for a wind turbine.