524 results for "implements"

Relevance: 10.00%

Publisher:

Abstract:

This dissertation consists of three distinct components: (1) "Double Rainbow," a notated composition for an acoustic ensemble of 10 instruments, ca. 36 minutes; (2) "Appalachiana," a fixed-media composition for electro-acoustic music and video, ca. 30 minutes; and (3) "'The Invisible Mass': Exploring Compositional Technique in Alfred Schnittke's Second Symphony," an analytical article.

(1) Double Rainbow is a ca. 36-minute composition in four movements scored for 10 instruments: flute, Bb clarinet (doubling on bass clarinet), tenor saxophone (doubling on alto saxophone), French horn, percussion (glockenspiel, vibraphone, wood block, 3 toms, snare drum, bass drum, suspended cymbal), piano, violin, viola, cello, and double bass. Each of the four movements of the piece explores its own distinct character and set of compositional goals. The piece is presented as a musical score and as a recording, which was extensively treated in post-production.

(2) Appalachiana is a ca. 30-minute fixed-media composition for music and video. The musical component was created as a vehicle to showcase several approaches to electro-acoustic music composition: FFT re-synthesis for time-manipulation effects, the use of a custom-built software instrument that implements generative approaches to creating rhythm and pitch patterns, the use of a recording of rain to create rhythmic triggers for software instruments, and the recording of additional components with acoustic instruments. The video component transforms footage of natural landscapes filmed at several locations in North Carolina, Virginia, and West Virginia into a surreal narrative using a variety of color, lighting, distortion, and time-manipulation video effects.

(3) “‘The Invisible Mass:’ Exploring Compositional Technique in Alfred Schnittke’s Second Symphony” is an analytical article that focuses on Alfred Schnittke’s compositional technique as evidenced in the construction of his Second Symphony and discussed by the composer in a number of previously untranslated articles and interviews. Though this symphony is pivotal in the composer’s oeuvre, there are currently no scholarly articles that offer in-depth analyses of the piece. The article combines analyses of the harmony, form, and orchestration in the Second Symphony with relevant quotations from the composer, some from published and translated sources and others newly translated by the author from research at the Russian State Library in St. Petersburg. These offer a perspective on how Schnittke’s compositional technique combines systematic geometric design with keen musical intuition.


Many countries have set challenging wind power targets to achieve by 2020. This paper implements a realistic analysis of curtailment and constraint of wind energy at a nodal level using a unit commitment and economic dispatch model of the Irish Single Electricity Market in 2020. The key findings show that significant reduction in curtailment can be achieved when the system non-synchronous penetration limit increases from 65% to 75%. For the period analyzed, this results in a decreased total generation cost and a reduction in the dispatch-down of wind. However, some nodes experience significant dispatch-down of wind, which can be in the order of 40%. This work illustrates the importance of implementing analysis at a nodal level for the purpose of power system planning.
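As a rough illustration of the key finding, a single-node toy dispatch shows how raising the system non-synchronous penetration (SNSP) limit releases curtailed wind; the demand and wind-availability figures below are invented, not outputs of the Irish Single Electricity Market model.

```python
# Toy single-node dispatch sketch: wind is dispatched first but capped so that
# non-synchronous output / demand <= SNSP limit; the rest is synchronous.
# All figures are illustrative, not from the paper.

def dispatch(demand_mw, wind_available_mw, snsp_limit):
    """Return (wind_used, curtailed) in MW under the SNSP cap."""
    max_wind = snsp_limit * demand_mw        # SNSP cap on non-synchronous output
    wind_used = min(wind_available_mw, max_wind)
    curtailed = wind_available_mw - wind_used
    return wind_used, curtailed

demand, wind = 5000.0, 4000.0                # MW, one hypothetical hour
for limit in (0.65, 0.75):
    used, cut = dispatch(demand, wind, limit)
    print(f"SNSP {limit:.0%}: wind used {used:.0f} MW, curtailed {cut:.0f} MW")
```

With these numbers, moving the limit from 65% to 75% cuts hourly curtailment from 750 MW to 250 MW, mirroring the direction of the paper's result.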


Increasing air pollution, driven by the population's growing demand for energy and mobility, especially in urban areas, raises health risks and thus degrades quality of life. Besides avoiding emissions of toxic gases, which is the optimal medium- and long-term measure for improving air quality, the degradation of emitted air pollutants is a suitable and quickly effective remedy. Such degradation can be achieved by photocatalysis; however, photocatalysts based on the semiconductor titanium dioxide (TiO2) exploit only a small fraction of the solar emission spectrum and are ineffective indoors and in other low-UV environments. To overcome these drawbacks, a photocatalyst was developed and produced that consists of TiO2 (P25), serving both as UV-active photocatalyst and as support material, coated with a porphyrazine dye that is itself photoactive in the visible range. For this purpose, the sterically demanding dye molecules, equipped at their periphery with eight binding motifs for TiO2, were immobilized on the semiconductor surface. The resulting porphyrazine-titanium dioxide hybrids were characterized in detail. Among other things, the binding of the dyes to the titanium dioxide surface was examined by means of adsorption isotherms, and the UV/Vis spectroscopic properties of the hybrid material were investigated. To determine the photocatalytic activities of the individual components and of the hybrid material, they were tested for their ability to form singlet oxygen, hydrogen peroxide, and hydroxyl radicals, as well as for their ability to degrade NO in a procedure based on ISO 22197-1, in each case under irradiation in three wavelength ranges (UV radiation, blue light, and red light).
In addition, the activity of the hybrid material in the photodynamic inactivation (PDI) of bacteria under UV and red-light irradiation was determined in comparison with pure titanium dioxide. Characterization of the hybrid material showed that, in a neutral suspension, the dye molecules are bound to the surface almost irreversibly in a monomolecular layer with a binding energy of -41.43 kJ/mol, and that the hybrid material absorbs photons across large parts of the UV/Vis spectrum with high extinction coefficients of up to 10^5 M^-1 cm^-1. The spectrum of the hybrid material is the additive combination of the two individual spectra. The implications of the characterization results for the formation of reactive oxygen species were discussed in detail. A comparison of the activities with respect to the formation of reactive oxygen species showed that, except for the formation of hydroxyl radicals under UV irradiation, the activity of the hybrid material was markedly higher than that of pure titanium dioxide in all experiments. In contrast to pure titanium dioxide, the hybrid material generated photophysically clearly detectable amounts of singlet oxygen in all wavelength ranges investigated. To explain and interpret these observations, a differentiated discussion was conducted that draws on and incorporates the results of the hybrid-particle characterization. The comparison of NO degradation efficiencies consistently yielded markedly higher values for the hybrid material in all experiments. Moreover, the hybrid material was shown to form substantially smaller amounts of the undesired degradation by-product NO2. In the course of the discussion, several possible mechanisms of the "clean" oxidation to nitrate by the hybrid material were presented.
Studies on the photodynamic inactivation of various bacteria showed that, in addition to an activity under UV irradiation similar to that of P25, the hybrid material, unlike P25, also achieves PDI of various bacteria under red-light irradiation.
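As a sketch of how a reported binding energy and an adsorption isotherm relate, the following assumes a Langmuir model and treats the -41.43 kJ/mol value as a free energy of adsorption; both are assumptions for illustration, since the abstract does not state which isotherm model was fitted.

```python
import math

# Hedged sketch: relating a binding energy to a Langmuir equilibrium constant
# and monolayer coverage. Numbers other than the binding energy are invented.

R = 8.314        # J/(mol*K), gas constant
T = 298.15       # K, assumed room temperature
dG = -41.43e3    # J/mol, binding energy taken from the adsorption isotherms

# Dimensionless equilibrium constant (unit standard state assumed).
K = math.exp(-dG / (R * T))

def langmuir_coverage(c, K):
    """Fractional monolayer coverage at dye activity c (relative to standard state)."""
    return K * c / (1.0 + K * c)

print(f"K = {K:.3e}")
print(f"coverage at c = 1e-6: {langmuir_coverage(1e-6, K):.3f}")
```

A binding energy this large gives an equilibrium constant on the order of 10^7, so even very dilute dye saturates the monolayer, consistent with the nearly irreversible binding reported.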


This report describes a tool for global optimization that implements the Differential Evolution optimization algorithm as a new Excel add-in. The tool goes a step beyond Excel's Solver add-in, because Solver often returns a local minimum, that is, a point that is less than or equal to nearby points, whereas Differential Evolution searches for the global minimum over all feasible points. Despite the complex underlying mathematics, the tool is relatively easy to use and can be applied to practical optimization problems, such as setting prices and awards in a hotel loyalty program. The report demonstrates an example of how to develop an optimal approach to that problem.
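For readers curious about the algorithm behind the add-in, a minimal DE/rand/1/bin loop (mutation, binomial crossover, greedy selection) might look like the sketch below; the control parameters are common defaults, not necessarily the add-in's, and the Rastrigin test function stands in for a real spreadsheet model.

```python
import math
import random

# Minimal Differential Evolution (DE/rand/1/bin) sketch for global minimization.

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct vectors other than the target.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:      # binomial crossover
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # mutation
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)               # clamp to bounds
                else:
                    v = pop[i][j]
                trial.append(v)
            f_trial = f(trial)
            if f_trial <= costs[i]:                       # greedy selection
                pop[i], costs[i] = trial, f_trial
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Multimodal test function: global minimum 0 at the origin, many local minima.
def rastrigin(x):
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

best_x, best_f = differential_evolution(rastrigin, [(-5.12, 5.12)] * 2)
print(best_x, best_f)
```

On a function like Rastrigin, a gradient-style local search of the kind Solver uses typically stalls in one of the many local basins, while the population-based search above reaches the global basin.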


Current Ambient Intelligence and Intelligent Environment research focuses on interpreting a subject's behaviour at the activity level by logging Activities of Daily Living (ADL) such as eating and cooking. In general, the sensors employed (e.g. PIR sensors, contact sensors) provide low-resolution information. Meanwhile, the expansion of ubiquitous computing allows researchers to gather additional information from different types of sensor, which makes it possible to improve activity analysis. Building on previous research into sitting posture detection, this research further analyses human sitting activity. The aim is to use a non-intrusive, low-cost, pressure-sensor-embedded chair system to recognize a subject's activity from their detected postures. The research comprises three steps: the first is to find a hardware solution for low-cost sitting posture detection, the second is to find a suitable strategy for sitting posture detection, and the last is to correlate time-ordered sitting posture sequences with sitting activity. The author developed a prototype sensing system called IntelliChair for sitting posture detection. Two experiments were conducted to determine the hardware architecture of the IntelliChair system. The first examines sensor selection and the integration of various sensors, and identifies the best choice for a low-cost, non-intrusive system. Subsequently, the research applies signal processing theory to explore the frequency characteristics of sitting posture, in order to determine a suitable sampling rate for the IntelliChair system. For the second and third steps, ten subjects were recruited for sitting posture and sitting activity data collection. The former dataset was collected by asking subjects to perform certain pre-defined sitting postures on IntelliChair, and is used for the posture recognition experiment.
The latter dataset was collected by asking the subjects to follow their normal sitting activity routine on IntelliChair for four hours, and is used for the activity modelling and recognition experiment. For the posture recognition experiment, two Support Vector Machine (SVM) based classifiers were trained (one for spine postures and the other for leg postures) and their performance evaluated. A Hidden Markov Model is used for sitting activity modelling and recognition, in order to infer the selected sitting activities from sitting posture sequences. After experimenting with possible sensors, the Force Sensing Resistor (FSR) was selected as the pressure sensing unit for IntelliChair. Eight FSRs are mounted on the seat and back of a chair to gather haptic (i.e., touch-based) posture information. Furthermore, the research explores the possibility of using alternative non-intrusive sensing technology (the vision-based Kinect sensor from Microsoft) and finds that the Kinect sensor is not reliable for sitting posture detection due to the joint drifting problem. A suitable sampling rate for IntelliChair, determined from the experimental results, is 6 Hz. The posture classification performance shows that the SVM-based classifiers are robust to "familiar" subject data (accuracy is 99.8% for spine postures and 99.9% for leg postures). When dealing with "unfamiliar" subject data, the accuracy is 80.7% for spine posture classification and 42.3% for leg posture classification. Activity recognition achieves 41.27% accuracy across four selected activities (relaxing, playing a game, working with a PC, and watching video). The results of this thesis show that individual body characteristics and sitting habits influence both sitting posture and sitting activity recognition. This suggests that IntelliChair is suitable for individual use, but that a training stage is required.


Particle filtering has proven to be an effective localization method for wheeled autonomous vehicles. For a given map, a sensor model, and observations, occasions arise where the vehicle could equally likely be in many locations of the map. Because particle filtering algorithms may generate low confidence pose estimates under these conditions, more robust localization strategies are required to produce reliable pose estimates. This becomes more critical if the state estimate is an integral part of system control. We investigate the use of particle filter estimation techniques on a hovercraft vehicle. The marginally stable dynamics of a hovercraft require reliable state estimates for proper stability and control. We use the Monte Carlo localization method, which implements a particle filter in a recursive state estimate algorithm. An H-infinity controller, designed to accommodate the latency inherent in our state estimation, provides stability and controllability to the hovercraft. In order to eliminate the low confidence estimates produced in certain environments, a multirobot system is designed to introduce mobile environment features. By tracking and controlling the secondary robot, we can position the mobile feature throughout the environment to ensure a high confidence estimate, thus maintaining stability in the system. A laser rangefinder is the sensor the hovercraft uses to track the secondary robot, observe the environment, and facilitate successful localization and stability in motion.
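A minimal 1-D Monte Carlo localization sketch of the predict-weight-resample cycle described above; the map, the single landmark, and all noise values are invented (the actual system is planar and uses a laser rangefinder).

```python
import math
import random
import statistics

# One Monte Carlo localization step: predict with noisy motion, weight each
# particle by the likelihood of a range observation, then resample.

def mcl_step(particles, control, observation, landmark, rng,
             motion_noise=0.1, sense_noise=0.5):
    # Predict: move each particle by the control input plus Gaussian noise.
    moved = [p + control + rng.gauss(0.0, motion_noise) for p in particles]
    # Update: Gaussian likelihood of the observed range to the landmark.
    def likelihood(p):
        err = abs(landmark - p) - observation
        return math.exp(-(err * err) / (2.0 * sense_noise ** 2))
    weights = [likelihood(p) for p in moved]
    # Resample proportionally to weight.
    return rng.choices(moved, weights=weights, k=len(moved))

rng = random.Random(0)
landmark = 10.0
particles = [rng.uniform(0.0, 10.0) for _ in range(500)]  # uniform prior
true_pose = 0.0
for _ in range(8):
    true_pose += 1.0                     # vehicle moves 1 unit per step
    z = landmark - true_pose             # noiseless range reading for the sketch
    particles = mcl_step(particles, 1.0, z, landmark, rng)
estimate = statistics.mean(particles)
print(round(estimate, 2), true_pose)
```

Note that a range to a single landmark is ambiguous (two poses match each reading); here the prior rules out the mirror pose, which is exactly the kind of ambiguity the paper resolves by positioning a mobile feature.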


Stress is a phenomenon that on some level affects everyone’s lives on a daily basis. The autonomic nervous system controls the varying levels of stress at any given time. The responses of the autonomic nervous system adjust the body to cope with changing external and internal conditions. During high-stress situations the body is forced into a state of heightened alertness, which passes when the stressor is removed. The stressor can be any external or internal event that causes the body to respond. Stress is a very versatile phenomenon that can be both a cause and an indicator of other medical conditions, for example cardiovascular disease. Stress detection can therefore be helpful in identifying these conditions and monitoring the overall emotional state of a person. Electrodermal activity (EDA) is one of the most easily implemented ways to monitor the activity of the autonomic nervous system. EDA describes changes occurring in the various electrical properties of the skin, including skin conductivity and resistance. Increased emotional sweating has been proven to be one possible indication of stress. On the surface of the skin, increased sweating translates to increased skin conductivity, which can be observed through EDA measurements. This makes electrodermal activity a very useful tool in a wide range of applications where it is desirable to observe changes in a person’s stress level. EDA can be recorded by using specialized body sensors placed on specific locations on the body. Most commonly used recording sites are the palms of the hands due to the high sweat gland density on those areas. Measurement is done using at least two electrodes attached to the skin, and recording the electrical conductance between them. This thesis implements a prototype of a wireless EDA measurement system. The feasibility of the prototype is also verified with a small group of test subjects. EDA was recorded from the subjects while they were playing a game of Tetris. 
The goal was to observe variations in the measured EDA that would indicate changes in the subjects’ stress levels during the game. The analysis of the obtained measurement results confirmed the connection between stress and recorded EDA. During the game, random occurrences of lowered skin resistance were clearly observable, which indicates points in the game where the player felt more anxious. A wireless measurement system has the potential of offering more flexible and comfortable long-term measuring of EDA, and could be utilized in a wide range of applications.
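As a sketch of the measurement principle, skin conductance can be recovered from a voltage divider formed by the skin and a known series resistor; the supply voltage, resistor value, and ADC resolution below are assumptions for illustration, not the thesis prototype's actual design.

```python
# Voltage-divider conversion: ADC reading -> skin conductance in microsiemens.
# All circuit constants are hypothetical.

V_SUPPLY = 3.3         # V, excitation across the divider
R_SERIES = 100_000.0   # ohm, known series resistor
ADC_MAX = 4095         # 12-bit ADC full scale

def skin_conductance_us(adc_reading):
    """ADC measures the drop across the series resistor; solve for skin R."""
    if adc_reading <= 0:
        return 0.0                      # open circuit: no measurable conductance
    v_series = V_SUPPLY * adc_reading / ADC_MAX
    v_skin = V_SUPPLY - v_series
    if v_skin <= 0:
        return float("inf")             # electrodes effectively shorted
    r_skin = v_skin * R_SERIES / v_series   # same current through both elements
    return 1e6 / r_skin                 # conductance in microsiemens

print(skin_conductance_us(1000))
```

A rise in emotional sweating lowers skin resistance, so the ADC reading and the computed conductance both rise, which is the signal the Tetris experiment looks for.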


On the national scene, the soybean crop occupies a prominent position in cultivated area and production volume, being grown largely under the no-tillage system. Owing to the intense traffic of machines and implements on the soil surface, this system has caused soil compaction problems, which reduce crop yields. To minimize this effect, seeder-drills open the furrow with either a shank or a double-disc mechanism. The shank has become commonplace because it disrupts the compacted surface layer; however, it demands more energy and may cause excessive tillage in areas without high levels of compaction. Thus, this study aimed to evaluate the effects of furrow-opening mechanisms and soil compaction levels on the traction required by a seeder-drill and on the growth and productivity of soybean in a clayey Oxisol, over two growing seasons. The experimental design consisted of randomized blocks with split plots: the main plots comprised four levels of soil compaction (N0, no-tillage without additional compaction; N1, N2, and N3, no-tillage subjected to compaction by two, four, and six tractor passes, respectively), corresponding to soil bulk densities of 1.16, 1.20, 1.22, and 1.26 g cm-3, and the subplots comprised two furrow-opening mechanisms (shank and double disc), with four replicates. To evaluate the mean, maximum, and specific traction force required by the seeder-drill, a load cell with a capacity of 50 kN and a sensitivity of 2 mV V-1 was coupled between the tractor and the seeder-drill, with the data stored in a Campbell Scientific CR800 datalogger.
In addition, bulk density, soil mechanical resistance to penetration, sowing depth, furrow depth and width, mobilized soil area, emergence speed index, emergence, final plant stand, stem diameter, plant height, average number of seeds per pod, 1,000-seed weight, number of pods per plant, and crop productivity were evaluated. Data were subjected to analysis of variance; the means for the furrow openers were compared by Tukey's test (p≤0.05), while for the soil compaction factor polynomial regression analysis was adopted, with models selected by the criteria of highest R2 and significance (p≤0.05) of the equation parameters. Regardless of the growing season, penetration resistance increased with soil compaction level down to around 0.20 m depth, and bulk density influenced the sowing quality parameters but did not affect crop yield. In the first season, productivity was higher with the shank opener. In the second season, the shank demanded a greater energy requirement as bulk density increased, with the opposite trend for the double disc. The locking of the sowing lines allows the shank to perform better in breaking the compacted layer.
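The load-cell processing can be sketched as follows; the 50 kN capacity and 2 mV V-1 full-scale sensitivity follow the text, while the sample readings and the mobilized-area figure used for the specific force are invented for illustration.

```python
# Load-cell samples -> mean, maximum, and specific traction force.
# Calibration follows the abstract; samples and mobilized area are hypothetical.

CAPACITY_KN = 50.0          # load-cell capacity
FULL_SCALE_MV_PER_V = 2.0   # sensitivity at full scale

def force_kn(signal_mv_per_v):
    """Linear load-cell calibration: output in mV per volt of excitation."""
    return CAPACITY_KN * signal_mv_per_v / FULL_SCALE_MV_PER_V

samples_mv_per_v = [0.28, 0.31, 0.35, 0.30, 0.42, 0.33]   # hypothetical log
forces = [force_kn(s) for s in samples_mv_per_v]

mean_force = sum(forces) / len(forces)     # kN
max_force = max(forces)                    # kN
mobilized_area_m2 = 0.004                  # assumed furrow cross-section, m^2
specific_force = mean_force / mobilized_area_m2   # kN per m^2 of mobilized soil

print(mean_force, max_force, specific_force)
```

Normalizing by mobilized area is one common way to define specific traction force, so openers that move different amounts of soil can be compared fairly; the thesis may define it differently.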


This work investigates the use of the kitchens in the PLANO 100 apartments in Natal-RN, through a set of methods for functional and behavioral evaluation. The theme arose from questions seeking to understand how individuals relate to the built space, what changes this relationship may cause, and how this space interferes in the daily life of its users. The research seeks answers that can improve the object of study as well as future architectural production. The approach combined an overview of new kinds of family arrangements and of the evolution of the Brazilian kitchen within the country's social context with APO (Post-Occupancy Evaluation) techniques: a physical survey of the space, questionnaires, and interviews with users. In addition to the APO instruments, behavior-setting techniques were also applied, which provided deeper insight into the satisfaction levels indicated by the users.


Giving new meaning to the topos of the dead city, which was especially productive in fin-de-siècle literature, Sorj Chalandon constructs in Le quatrième mur a poetics of plunder that turns the devastated city into a space of creation, at once novelistic and tragic, in which to reinvent the genre of the tomb.


The persistence concern implemented as an aspect has been studied since the appearance of the Aspect-Oriented paradigm. Persistence is frequently given as an example of a concern that can be aspectized, but to date no real-world solution has applied that paradigm. Such a solution should enhance programmer productivity and make the application less prone to errors. To test the viability of the concept, in a previous study we developed a prototype that implements Orthogonal Persistence as an aspect. This first version of the prototype was already fully functional with all Java types, including arrays. In this work we present the results of our new research to overcome some limitations we identified in the prototype's data type abstraction and transparency. One of our goals was to avoid the standard Java idiom for genericity, based on casts, type tests, and subtyping. Moreover, we also found the need to introduce some dynamic data type capabilities. We consider reflection to be the solution to those issues. To achieve this, we have extended our prototype with a new static weaver that preprocesses the application source code in order to change the normal behavior of the Java compiler through newly generated reflective code.
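The prototype itself targets Java and aspect weaving, but the core idea, persistence woven in as a cross-cutting concern so the domain class stays oblivious to storage, can be sketched in Python by "advising" attribute writes; the `STORE` dict below is a hypothetical stand-in for a real backing store.

```python
# Persistence as an 'aspect': a class decorator weaves after-advice onto every
# attribute write, mirroring the structure (not the mechanics) of an AspectJ
# pointcut on field assignment. STORE stands in for a real database.

STORE = {}   # object id -> dict of persisted fields

def persistent(cls):
    """Weave a store update into every attribute write on cls instances."""
    original_setattr = cls.__setattr__
    def woven_setattr(self, name, value):
        original_setattr(self, name, value)              # proceed with the write
        STORE.setdefault(id(self), {})[name] = value     # after-advice: persist
    cls.__setattr__ = woven_setattr
    return cls

@persistent
class Account:
    # The domain class contains no persistence code at all.
    def __init__(self, owner, balance):
        self.owner = owner
        self.balance = balance

acc = Account("alice", 100)
acc.balance = 250
print(STORE[id(acc)])
```

The point of the orthogonal-persistence claim is visible here: `Account` is pure domain logic, and swapping the backing store touches only the "aspect".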


Research communities spread all over the planet create considerable amounts of knowledge while carrying out their activities. The best-known ways of communicating this knowledge are publishing papers, books, and reports, among others. These documents are stored in libraries and, increasingly, on the Internet, allowing people to access up-to-date information more efficiently than ever. However, this great amount of available information can become an obstacle to its transformation into knowledge. It is therefore relevant to implement mechanisms that enable this transformation. This dissertation presents a proposal for a new information service: a collaborative bibliographic review service, which supports researchers in this specific task as well as in registering its results, thus enabling future reviews by third parties. The model developed is grounded in a documentary collection accessible through a commercial library management system, over which a web of semantically enriched connections is established, registering preferential paths for exploring the collection according to multiple criteria, as well as notes attached to documents and to the paths themselves. A prototype is presented that implements the fundamental ideas of the model; at its basic level it provides an access interface to a repository of documents in electronic form. On top of this level there are two others, focused on registering information added by the users of the service. The results obtained with this prototype clearly show the viability of the model.


Recent years have seen an astronomical rise in SQL Injection Attacks (SQLIAs) used to compromise the confidentiality, authentication, and integrity of organisations' databases. Intruders becoming smarter at obfuscating web requests to evade detection, combined with increasing volumes of web traffic from the Internet of Things (IoT), cloud-hosted and on-premise business applications, have made it evident that existing approaches based mostly on static signatures lack the ability to cope with novel signatures. A SQLIA detection and prevention solution can be achieved by exploring an alternative bio-inspired supervised learning approach that takes as input a labelled dataset of numerical attributes for classifying true positives and negatives. We present in this paper Numerical Encoding to Tame SQLIA (NETSQLIA), which implements a proof of concept for scalable numerical encoding of features into a dataset of attributes with labelled classes obtained from deep web traffic analysis. In the numerical attribute encoding, the model leverages a proxy for the interception and decryption of web traffic. The intercepted web requests are then assembled for front-end SQL parsing and pattern matching by applying a traditional Non-Deterministic Finite Automaton (NFA). This paper presents a technique for extracting numerical attributes of any size, primed as an input dataset to an Artificial Neural Network (ANN) and statistical Machine Learning (ML) algorithms implemented using a Two-Class Averaged Perceptron (TCAP) and Two-Class Logistic Regression (TCLR), respectively. This methodology then forms the subject of an empirical evaluation of the suitability of the model for the accurate classification of both legitimate web requests and SQLIA payloads.
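A minimal sketch of the two-class logistic regression stage: plain logistic regression trained by stochastic gradient descent over numerically encoded request attributes. The features (quote count, SQL-keyword count, scaled length), weights, and data below are invented for illustration, not NETSQLIA's actual encoding.

```python
import math

# Logistic regression over numerically encoded web-request attributes.
# Label 1 = SQLIA payload, 0 = legitimate request. All data is hypothetical.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Stochastic gradient descent on the log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            g = p - yi                              # gradient of log-loss wrt z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Features: [n_single_quotes, n_sql_keywords, length / 100]
X = [[0, 0, 0.3], [0, 1, 0.5], [2, 3, 1.2], [3, 2, 0.9],
     [0, 0, 0.4], [1, 4, 1.5]]
y = [0, 0, 1, 1, 0, 1]
w, b = train_logreg(X, y)

def classify(x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5

print([classify(x) for x in X])
```

The point of the numerical encoding step is exactly this: once requests become fixed-length numeric vectors, standard two-class learners such as logistic regression or a perceptron apply directly.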


Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks and the increasing feasibility of attacks on these services represent a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is our use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable, and, in some cases, more efficient than generic constant-rate defenses. 
We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide 3-5x improvement in bandwidth for bulk transfers and 2.5-9.5x speedup for Web browsing over tunneling without biasing.
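The core tunneling idea above, emitting packets on a model-driven schedule and filling each slot with real data when available and cover otherwise, can be sketched as follows; the gap and size distributions are invented stand-ins, not TrafficMimic's protocol behavior models.

```python
import random

# Cover-traffic tunneling sketch: the transmit schedule and packet sizes come
# from a traffic model, independent of the real load, so an observer sees the
# same stream whether or not real data is flowing. Model parameters are invented.

rng = random.Random(42)

def model_next_gap_ms():
    return rng.expovariate(1 / 20)        # mean 20 ms inter-packet gap

def model_next_size():
    return rng.choice([128, 512, 1460])   # plausible packet-size mixture

def tunnel(real_queue, n_slots):
    """Return (time_ms, size, is_cover) for each transmitted packet."""
    t, out = 0.0, []
    for _ in range(n_slots):
        t += model_next_gap_ms()          # schedule driven by the model only
        size = model_next_size()
        is_cover = not real_queue
        if real_queue:
            real_queue.pop(0)             # real payload fills the slot
        out.append((t, size, is_cover))
    return out

stream = tunnel(["req1", "req2"], 5)
print(stream)
```

Because timing and sizes never depend on the real queue, the observable stream leaks nothing about when real data is present, which is the property the classification attacks in the dissertation probe.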


Jerne's idiotypic network theory postulates that the immune response involves inter-antibody stimulation and suppression as well as matching to antigens. The theory has proved the most popular Artificial Immune System (AIS) model for incorporation into behavior-based robotics but guidelines for implementing idiotypic selection are scarce. Furthermore, the direct effects of employing the technique have not been demonstrated in the form of a comparison with non-idiotypic systems. This paper aims to address these issues. A method for integrating an idiotypic AIS network with a Reinforcement Learning based control system (RL) is described and the mechanisms underlying antibody stimulation and suppression are explained in detail. Some hypotheses that account for the network advantage are put forward and tested using three systems with increasing idiotypic complexity. The basic RL, a simplified hybrid AIS-RL that implements idiotypic selection independently of derived concentration levels and a full hybrid AIS-RL scheme are examined. The test bed takes the form of a simulated Pioneer robot that is required to navigate through maze worlds detecting and tracking door markers.
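Idiotypic selection in the style of Farmer's computational model of Jerne's network can be sketched as a concentration update in which antigen affinity and inter-antibody stimulation raise, and suppression and a death term lower, each antibody's concentration; the matrices and rates below are invented for illustration, not taken from the paper's robot controller.

```python
# One Euler step of a Farmer-style concentration dynamic:
# dC_i/dt ~ C_i * (antigen_match_i + stimulation_i - suppression_i - death).
# All numbers are hypothetical.

def update_concentrations(conc, antigen_match, stim, supp, b=0.1, k=0.05):
    n = len(conc)
    new = []
    for i in range(n):
        stimulation = sum(stim[i][j] * conc[j] for j in range(n))
        suppression = sum(supp[i][j] * conc[j] for j in range(n))
        growth = antigen_match[i] + stimulation - suppression - k
        new.append(max(conc[i] + b * conc[i] * growth, 0.0))
    return new

conc = [1.0, 1.0, 1.0]       # three candidate antibodies (behaviours)
match = [0.9, 0.4, 0.1]      # affinity to the current antigen (sensor state)
stim = [[0.0, 0.3, 0.0],     # stim[i][j]: how much antibody j stimulates i
        [0.0, 0.0, 0.2],
        [0.1, 0.0, 0.0]]
supp = [[0.0, 0.0, 0.2],     # supp[i][j]: how much antibody j suppresses i
        [0.3, 0.0, 0.0],
        [0.0, 0.1, 0.0]]

for _ in range(10):
    conc = update_concentrations(conc, match, stim, supp)
selected = max(range(3), key=conc.__getitem__)   # idiotypic selection
print(conc, selected)
```

This is the contrast the paper's hypotheses target: a non-idiotypic scheme would select purely on `match`, whereas here the network terms can amplify or suppress a candidate beyond its raw antigen affinity.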