947 results for Algorithmic skeleton
Abstract:
The change in the carbonaceous skeleton of nanoporous carbons during their activation has received limited attention, unlike its counterpart process in the presence of an inert atmosphere. Here we adopt a multi-method approach to elucidate this change in a poly(furfuryl alcohol)-derived carbon activated using cyclic application of oxygen saturation at 250 °C followed by its removal (along with carbon) at 800 °C in argon. The methods used include helium pycnometry, synchrotron-based X-ray diffraction (XRD) and associated radial distribution function (RDF) analysis, transmission electron microscopy (TEM) and, uniquely, electron energy-loss spectroscopy spectrum-imaging (EELS-SI), electron nanodiffraction and fluctuation electron microscopy (FEM). Helium pycnometry indicates the solid skeleton of the carbon densifies during activation from 78% to 93% of the density of graphite. RDF analysis, EELS-SI, and FEM all suggest this densification comes through an in-plane growth of sp2 carbon out to the medium range without a commensurate increase in order normal to the plane. This process could be termed ‘graphenization’. The exact way in which this process occurs is not clear, but TEM images of the carbon before and after activation suggest it may come through removal of the more reactive carbon, breaking constraining cross-links and creating space that allows the remaining carbon material to migrate in an annealing-like process.
Abstract:
We report a multi-wavelength Raman spectroscopy study of the structural changes along the thermal annealing pathway of a poly(furfuryl alcohol) (PFA) derived nanoporous carbon (NPC). The Raman spectra were deconvoluted utilizing G, D, D′, A and TPA bands. The appropriateness of these deconvolutions was confirmed via recovery of the correct dispersive behaviours of these bands. It is proposed that the ID/IG ratio is composed of two parts: one associated with the extent of graphitic crystallites (the Tuinstra–Koenig relationship), and a second related to the inter-defect distance. This model was used to successfully determine the variation of the in-plane crystallite size and intra-plane defect density along the annealing pathway. It is proposed that the NPC skeleton evolves along the annealing pathway in two stages: below 1600 °C evolution is dominated by a reduction of in-plane defects with minor crystallite growth, while above this temperature crystallite growth accelerates as the in-plane defect density approaches zero. A significant amount of trans-polyacetylene (TPA)-like structures was found to remain even at 2400 °C. These may be responsible for the resistance of the PFA-based carbon to further graphitization at higher temperatures.
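To make the two-part model concrete, one plausible form (an illustrative reconstruction, not the paper's exact expression) adds the Tuinstra–Koenig crystallite-size term to an inter-defect-distance term:

    \frac{I_D}{I_G} \;=\; \frac{C(\lambda)}{L_a} \;+\; \frac{C'(\lambda)}{L_D^{2}}

where $L_a$ is the in-plane crystallite size, $L_D$ the mean inter-defect distance, and $C(\lambda)$, $C'(\lambda)$ are excitation-wavelength-dependent coefficients; the first term is the Tuinstra–Koenig relationship and the second captures the defect contribution. Fitting both terms at several laser wavelengths would then allow $L_a$ and $L_D$ to be tracked separately along the annealing pathway.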
Abstract:
To tackle challenges in circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms to explore efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first algorithmic framework for transceiver insertion in optical interconnects. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart home area to improve health, wellbeing and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes. The extended scheme schedules household appliances so as to minimize a customer's monetary expense under a time-varying pricing model.
Abstract:
Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. But the next step is anything but obvious for the artist/creator/inventor/designer stuck at that point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity - it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved. Indeed the solution is often the very definition of the problem. Design must be creative - or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems of knowing what we are looking for and the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process: conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term search becomes a misnomer since it carries connotations that imply it is possible to find what you are looking for. In such vast spaces the term search must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious. Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless we still have this tantalizing possibility: if a creative idea seems inevitable after the event, then somehow might the process be reversed? This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process, without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). But nevertheless Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as the application of evolutionary algorithms, which are usually thought of as search algorithms.
It is necessary to abandon such connections with searching and see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments. Most importantly, nature has all the time in the world. As designers we cannot afford such prototyping and ruthless experiment, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
Abstract:
Novice programmers have difficulty developing an algorithmic solution while simultaneously obeying the syntactic constraints of the target programming language. To see how students fare in algorithmic problem solving when not burdened by syntax, we conducted an experiment in which a large class of beginning programmers were required to write a solution to a computational problem in structured English, as if instructing a child, without reference to program code at all. The students produced an unexpectedly wide range of correct, and attempted, solutions, some of which had not occurred to their teachers. We also found that many common programming errors were evident in the natural language algorithms, including failure to ensure loop termination, hardwiring of solutions, failure to properly initialise the computation, and use of unnecessary temporary variables, suggesting that these mistakes are caused by inexperience at thinking algorithmically, rather than difficulties in expressing solutions as program code.
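To illustrate in code what two of these error classes look like (a hypothetical sketch chosen for illustration; the study itself collected natural-language algorithms, not programs), consider the following Python fragments:

    # Hypothetical illustration of two of the error classes named above:
    # failure to initialise the computation and failure to ensure loop termination.

    def count_evens_buggy(numbers):
        i = 0
        while i < len(numbers):
            count = count + 1 if numbers[i] % 2 == 0 else count  # fails: 'count' is read before it is initialised
            # 'i' is never incremented, so even with 'count' fixed the loop would not terminate

    def count_evens_fixed(numbers):
        count = 0                      # initialise the computation
        i = 0
        while i < len(numbers):
            if numbers[i] % 2 == 0:
                count += 1
            i += 1                     # advance the index so the loop terminates
        return count

    print(count_evens_fixed([1, 2, 3, 4, 6]))  # -> 3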
Abstract:
The management of main material prices for provincial highway project quotas suffers from lag and blindness. First, a web-based framework for a provincial highway project quota management information system (MIS) and a main material price data warehouse was established. Then, a concrete process for forecasting provincial highway project main material prices was put forward based on the BP neural network algorithm. After that, the standard BP algorithm, a BP network with an additional momentum term, and a BP network with a self-adaptive learning rate were compared in predicting highway project main material prices. The results indicate that it is feasible to predict highway main material prices using a BP neural network, and that the BP network with a self-adaptive learning rate performs best.
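As a rough illustration of the kind of network variants being compared (a minimal sketch under assumed synthetic data and hyper-parameters, not the paper's implementation), the following Python/NumPy code trains a one-hidden-layer BP network with an additional momentum term and a simple self-adaptive learning rate:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    X = rng.random((50, 3))                               # 50 samples, 3 assumed input features
    y = (X @ np.array([0.5, 0.3, 0.2])).reshape(-1, 1)    # synthetic target "prices" in [0, 1]

    n_in, n_hidden, n_out = 3, 6, 1
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
    dW1_prev = np.zeros_like(W1)
    dW2_prev = np.zeros_like(W2)

    lr, momentum = 0.5, 0.9
    prev_err = np.inf
    for epoch in range(2000):
        # forward pass
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        err = np.mean((y - out) ** 2)

        # self-adaptive learning rate: grow when the error falls, shrink when it rises
        lr = lr * 1.05 if err < prev_err else lr * 0.7
        prev_err = err

        # backward pass (standard BP gradients for sigmoid units)
        delta_out = (out - y) * out * (1 - out)
        delta_h = (delta_out @ W2.T) * h * (1 - h)

        # weight updates with the additional momentum term
        dW2 = -lr * (h.T @ delta_out) / len(X) + momentum * dW2_prev
        dW1 = -lr * (X.T @ delta_h) / len(X) + momentum * dW1_prev
        W2 += dW2
        W1 += dW1
        dW2_prev, dW1_prev = dW2, dW1

    print("final training MSE:", err)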
Abstract:
The solution of linear ordinary differential equations (ODEs) is commonly taught in first year undergraduate mathematics classrooms, but the understanding of the concept of a solution is not always grasped by students until much later. Recognising what it is to be a solution of a linear ODE and how to postulate such solutions, without resorting to tables of solutions, is an important skill for students to carry with them to advanced studies in mathematics. In this study we describe a teaching and learning strategy that replaces the traditional algorithmic, transmission presentation style for solving ODEs with a constructive, discovery based approach where students employ their existing skills as a framework for constructing the solutions of first and second order linear ODEs. We elaborate on how the strategy was implemented and discuss the resulting impact on a first year undergraduate class. Finally we propose further improvements to the strategy as well as suggesting other topics which could be taught in a similar manner.
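As a concrete example of the kind of solution-postulating described above (a standard textbook illustration, not taken from the study), consider the second order linear ODE

    y'' - 3y' + 2y = 0.

Postulating $y = e^{\lambda x}$ and substituting gives the characteristic equation $\lambda^{2} - 3\lambda + 2 = 0$, so $\lambda = 1$ or $\lambda = 2$, and the general solution $y = C_{1}e^{x} + C_{2}e^{2x}$ follows from the students' existing differentiation and algebra skills rather than from a table of solutions.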
Abstract:
Bone is important because it provides the skeleton structural integrity and enables movement and locomotion. Its development and morphology follow its function. It adapts to changes of mechanical loading and has the ability to repair itself after damage or fracture. The processes of bone development, bone adaptation, and bone regeneration in fracture healing are regulated, in part, by mechanical stimuli that result when the bone is loaded.
Abstract:
The SoundCipher software library provides an easy way to create music in the Processing development environment. With the SoundCipher library added to Processing you can write software programs that make music to go along with your graphics, and you can add sounds to enhance your Processing animations or games. SoundCipher provides an easy interface for playing 'notes' on the JavaSound synthesizer, for playback of audio files, and for communicating via MIDI. It provides accurate scheduling and allows events to be organised in musical time, using beats and tempo. It uses a 'score' metaphor that allows the construction of simple or complex musical arrangements. SoundCipher is designed to facilitate the basics of algorithmic music and interactive sound design as well as providing a platform for sophisticated computational music. It allows integration with the Minim library when more sophisticated audio and synthesis functionality is required, and with the oscP5 library for communicating via Open Sound Control.
Abstract:
An issue on generative music in Contemporary Music Review allows space to explore many of these controversies, and to explore the rich algorithmic scene in contemporary practice, as well as the diverse origins and manifestations of such a culture. A roster of interesting exponents from both academic and arts practice backgrounds is involved, matching the broad spectrum of current work. Contributed articles range from generative algorithms in live systems (from live coding to interactive music systems to computer games), through algorithmic modelling of longer-term form and evolutionary algorithms, to interfaces between modalities and mediums in algorithmic choreography. A retrospective on the intensive experimentation into algorithmic music and sound synthesis at the Institute of Sonology in the 1960s and 70s creates a complementary strand, as does an open paper on the issues raised by open-source, as opposed to proprietary, software and operating systems, with consequences for the creation and archiving of algorithmic work.
Abstract:
There are many interactive media systems, including computer games and media art works, in which it is desirable for music to vary in response to changes in the environment. In this paper we will outline a range of algorithmic techniques that enable music to adapt to such changes, taking into account the need for the music to vary in its expressiveness or mood while remaining coherent and recognisable. We will discuss the approaches which we have arrived at after experience in a range of adaptive music systems over recent years, and draw upon these experiences to inform discussion of relevant considerations and to illustrate the techniques and their effect.
Abstract:
This paper describes the use of the Chimera Architecture as the basis for a generative rhythmic improvisation system that is intended for use in ensemble contexts. This interactive software system learns in real time based on an audio input from live performers. The paper describes the components of the Chimera Architecture, including a novel analysis engine that uses prediction to robustly assess the rhythmic salience of the input stream. Analytical results are stored in a hierarchical structure that includes multiple scenarios which allow abstracted and alternate interpretations of the current metrical context. The system draws upon this Chimera Architecture when generating a musical response. The generated rhythms are intended to have a particular ambiguity in relation to the music performance by other members of the ensemble. Ambiguity is controlled through alternate interpretations of the Chimera. We describe an implementation of the Chimera Architecture that focuses on rhythmic material, and present and discuss initial experimental results of the software system playing along with recordings of a live performance.
Abstract:
SoundCipher is a software library written in the Java language that adds important music and sound features to the Processing environment that is widely used by media artists and otherwise has an orientation toward computational graphics. This article introduces the SoundCipher library and its features, describes its influences and design intentions, and positions it within the field of computer music programming tools. SoundCipher enables the rich history of algorithmic music techniques to be accessible within one of today’s most popular media art platforms. It also provides an accessible means for learning to create algorithmic music and sound programs.
Abstract:
This paper explores a method of comparative analysis and classification of data through perceived design affordances. Included is a discussion of the musical potential of data forms derived through eco-structural analysis of musical features inherent in audio recordings of natural sounds. A system of classification of these forms is proposed based on their structural contours. The classifications include four primitive types: steady, iterative, unstable and impulse. The classification extends previous taxonomies used to describe the gestural morphology of sound. The methods presented are used to provide compositional support for eco-structuralism.
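A rough sketch of how such a contour-based classification might be automated in Python (purely hypothetical heuristics; the features and thresholds are illustrative assumptions, not the paper's method):

    import numpy as np

    def classify_contour(envelope):
        """Assign a normalised amplitude-envelope contour to one of the four primitive types.

        Hypothetical heuristics for illustration only:
        - impulse: energy concentrated in a burst at the onset
        - steady: little variation around the mean level
        - iterative: regular repeating peaks
        - unstable: everything else (irregular or drifting contours)
        """
        env = np.asarray(envelope, dtype=float)
        env = env / (env.max() + 1e-12)                  # normalise to [0, 1]

        onset_fraction = env[: max(1, len(env) // 10)].sum() / (env.sum() + 1e-12)
        variation = env.std() / (env.mean() + 1e-12)
        # count prominent local maxima as a crude measure of repeating activity
        peaks = np.sum((env[1:-1] > env[:-2]) & (env[1:-1] > env[2:]) & (env[1:-1] > 0.2))

        if onset_fraction > 0.6:
            return "impulse"
        if variation < 0.2:
            return "steady"
        if peaks >= 4:
            return "iterative"
        return "unstable"

    # example: a decaying tremolo-like envelope is classified as 'iterative'
    t = np.linspace(0, 2, 200)
    print(classify_contour(np.exp(-t) * (0.6 + 0.4 * np.sin(2 * np.pi * 8 * t))))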