932 results for Input voltages


Relevance: 10.00%

Abstract:

This project investigates machine listening and improvisation in interactive music systems with the goal of improvising musically appropriate accompaniment to an audio stream in real time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation to allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent – the Jambot – capable of real-time musical improvisation in an ensemble setting. Conceptually the agent’s functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting.
All of these innovations have been combined into a computational agent – the Jambot – capable of producing improvised percussive musical accompaniment to an audio stream in real time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussions of the Jambot and its component algorithms, along with some comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
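As an aside to this abstract, a minimal energy-based onset detector illustrates the general idea of flagging percussive attacks in an audio stream. This is a generic sketch, not the thesis's spectral algorithms; the frame size and threshold ratio are arbitrary illustrative choices.

```python
import math

def detect_onsets(samples, frame_size=256, ratio=4.0):
    """Flag frame starts where short-time energy jumps by at least `ratio`."""
    onsets = []
    prev_energy = None
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energy = sum(s * s for s in frame) / frame_size
        if prev_energy is not None and prev_energy > 0 and energy / prev_energy >= ratio:
            onsets.append(start)
        prev_energy = energy
    return onsets

# A quiet tone followed by a loud burst starting at sample 512:
signal = [0.01 * math.sin(0.3 * n) for n in range(512)]
signal += [0.8 * math.sin(0.3 * n) for n in range(512)]
print(detect_onsets(signal))  # [512]
```

Practical systems (including, presumably, the Jambot) work on spectral features rather than raw frame energy in order to cope with complex polyphonic audio and to classify kick, snare and hi-hat attacks separately.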

Relevance: 10.00%

Abstract:

Multi-user single-antenna multiple-input multiple-output orthogonal frequency division multiplexing (MUSA-MIMO-OFDM) is a promising technology for improving the spectrum efficiency of fixed wireless broadband access systems in rural areas. This letter investigates the capacity of the MUSA-MIMO-OFDM uplink channel using theoretical, simulation, and empirical approaches, considering up to six users. We propose an empirical capacity formula suitable for rural areas. Characteristics of channel capacity temporal variations and their relationship with wind speed, observed in a rural area, are also presented.

Relevance: 10.00%

Abstract:

A distributed fuzzy system is a real-time fuzzy system in which the input, output and computation may be located on different networked computing nodes. The ability of a distributed software application, such as a distributed fuzzy system, to adapt to changes in the computing network at runtime can provide real-time performance improvement and fault-tolerance. This paper introduces an Adaptable Mobile Component Framework (AMCF) that provides a distributed dataflow-based platform with a fine-grained level of runtime reconfigurability. The execution location of small fragments (possibly as little as a few machine-code instructions) of an AMCF application can be moved between different computing nodes at runtime. A case study is included that demonstrates the applicability of the AMCF to a distributed fuzzy system scenario involving multiple physical agents (such as autonomous robots). Using the AMCF, fuzzy systems can now be developed such that they can be distributed automatically across multiple computing nodes and are adaptable to runtime changes in the networked computing environment. This provides the opportunity to improve the performance of fuzzy systems deployed in scenarios where the computing environment is resource-constrained and volatile, such as multiple autonomous robots, smart environments and sensor networks.

Relevance: 10.00%

Abstract:

The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for investigating machine vision algorithms. For modest image sizes the processing rate can be sufficiently "real-time" to allow for closed-loop control. Focus-of-attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a FireWire or web camera (support provided) and output to a robot (not provided), it would be possible to implement a visual servo system entirely in MATLAB. The Toolbox provides many functions that are useful in machine vision and vision-based control, as well as in photometry, photogrammetry and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion.

Relevance: 10.00%

Abstract:

The current study was motivated by statements made by the Economic Strategies Committee that Singapore’s recent productivity levels in services were well below those of economies such as the US, Japan and Hong Kong. Massive employment of foreign workers was cited as the reason for poor productivity levels. To shed more light on Singapore’s falling productivity, a nonparametric Malmquist productivity index was employed, which provides measures of productivity change, technical change and efficiency change. The findings reveal that growth in Total Factor Productivity (TFP) was attributable to technical change, with no improvement in efficiency change. Such results suggest that gains in TFP were input-driven rather than derived from a ‘best-practice’ approach such as improvements in operations or better resource allocation.
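The Malmquist index decomposes TFP change multiplicatively into efficiency change (catch-up to the frontier) and technical change (shift of the frontier). A toy calculation with made-up numbers, not figures from the study, shows how TFP growth can arise from technical change alone, as the abstract reports:

```python
def malmquist_tfp_change(efficiency_change, technical_change):
    """Multiplicative Malmquist decomposition; values > 1 indicate improvement."""
    return efficiency_change * technical_change

# No catch-up in efficiency (1.00) but a 5% outward frontier shift:
print(malmquist_tfp_change(1.00, 1.05))  # 1.05
```

Here all of the 5% TFP growth is attributable to technical change, mirroring the study's finding of no improvement in efficiency change.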

Relevance: 10.00%

Abstract:

A novel concept for producing high dc voltage for pulsed-power applications is proposed in this paper. The topology consists of an LC resonant circuit supplied through a tuned alternating waveform produced by an inverter. The control scheme is based on detecting variations in the resonant frequency and adjusting the switching signal patterns of the inverter to produce a square waveform with exactly the same frequency. Therefore, the capacitor voltage oscillates divergently with increasing amplitude. A simple one-stage capacitor-diode voltage multiplier (CDVM) connected to the resonant capacitor then rectifies the alternating voltage and gives a dc level equal to twice the input voltage amplitude. The produced high voltage then appears in the form of high-voltage pulses across the load. A basic model is simulated in MATLAB's Simulink platform and the results are included in the paper.
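For an ideal one-stage CDVM, neglecting diode drops, the rectified dc level is twice the ac amplitude at the resonant capacitor, as the abstract states. A small sketch with hypothetical figures (the voltage level and diode-drop parameter are illustrative assumptions, not values from the paper):

```python
def cdvm_output(input_amplitude, stages=1, diode_drop=0.0):
    """Ideal CDVM dc output: 2 * stages * (amplitude - diode drop)."""
    return 2 * stages * (input_amplitude - diode_drop)

# A hypothetical 10 kV resonant swing doubled to 20 kV with ideal diodes:
print(cdvm_output(10_000.0))  # 20000.0
# With a 0.7 V forward drop per diode, slightly less:
print(cdvm_output(10_000.0, diode_drop=0.7))
```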

Relevance: 10.00%

Abstract:

Objective: To (1) search the English-language literature for original research addressing the effect of cryotherapy on joint position sense (JPS) and (2) make recommendations regarding how soon healthy athletes can safely return to participation after cryotherapy. Data Sources: We performed an exhaustive search for original research using the AMED, CINAHL, MEDLINE, and SportDiscus databases from 1973 to 2009 to gather information on cryotherapy and JPS. Key words used were cryotherapy and proprioception, cryotherapy and joint position sense, cryotherapy, and proprioception. Study Selection: The inclusion criteria were (1) the literature was written in English, (2) participants were human, (3) an outcome measure included JPS, (4) participants were healthy, and (5) participants were tested immediately after a cryotherapy application to a joint. Data Extraction: The means and SDs of the JPS outcome measures were extracted and used to estimate the effect size (Cohen d) and associated 95% confidence intervals for comparisons of JPS before and after a cryotherapy treatment. The numbers, ages, and sexes of participants in all 7 selected studies were also extracted. Data Synthesis: The JPS was assessed in 3 joints: ankle (n = 2), knee (n = 3), and shoulder (n = 2). The average effect size for the 7 included studies was modest, with effect sizes ranging from −0.08 to 1.17, with a positive number representing an increase in JPS error. The average methodologic score of the included studies was 5.4/10 (range, 5–6) on the Physiotherapy Evidence Database scale. Conclusions: Limited and equivocal evidence is available to address the effect of cryotherapy on proprioception in the form of JPS. Until further evidence is provided, clinicians should be cautious when returning individuals to tasks requiring components of proprioceptive input immediately after a cryotherapy treatment.
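Cohen d for a pre/post comparison is the mean difference divided by the pooled standard deviation. A sketch with invented JPS numbers, not data from the reviewed studies:

```python
import math

def cohens_d(mean_pre, sd_pre, n_pre, mean_post, sd_post, n_post):
    """Standardised mean difference using the pooled standard deviation."""
    pooled_var = ((n_pre - 1) * sd_pre ** 2 + (n_post - 1) * sd_post ** 2) \
        / (n_pre + n_post - 2)
    return (mean_post - mean_pre) / math.sqrt(pooled_var)

# Hypothetical JPS error rising from 2.0 to 3.0 degrees after icing
# (SD 1.5 in both conditions, n = 15 per condition):
print(round(cohens_d(2.0, 1.5, 15, 3.0, 1.5, 15), 3))  # 0.667
```

By the review's sign convention, a positive d like this represents an increase in JPS error after cryotherapy.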

Relevance: 10.00%

Abstract:

The Commonwealth Scientific and Industrial Research Organisation (CSIRO) has recently conducted a technology demonstration of a novel fixed wireless broadband access system in rural Australia. The system is based on multi-user multiple-input multiple-output orthogonal frequency division multiplexing (MU-MIMO-OFDM). It demonstrated an uplink of six simultaneous users at distances ranging from 10 m to 8.5 km from a central tower, achieving 20 bit/s/Hz spectrum efficiency. This paper reports on the analysis of channel capacity and bit error probability simulation based on the measured MU-MIMO-OFDM channels obtained during the demonstration, and their comparison with results based on channels simulated by a novel geometric-optics-based channel model suitable for MU-MIMO-OFDM in rural areas. Despite its simplicity, the model was found to predict channel capacity and bit error probability accurately for a typical MU-MIMO-OFDM deployment scenario.
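As a rough plausibility check (not CSIRO's analysis), the aggregate spectral efficiency of several spatially separated single-antenna users can be approximated by summing per-user Shannon efficiencies. With six users at an assumed ~10 dB SNR each (a hypothetical figure), the total lands near the reported 20 bit/s/Hz:

```python
import math

def sum_spectral_efficiency(snrs_linear):
    """Aggregate Shannon spectral efficiency in bit/s/Hz, summed over users."""
    return sum(math.log2(1 + snr) for snr in snrs_linear)

# Six users, each at an assumed ~10 dB SNR (linear ratio 10):
print(round(sum_spectral_efficiency([10.0] * 6), 1))  # 20.8
```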

Relevance: 10.00%

Abstract:

This note examines the productive efficiency of 62 starting guards during the 2011/12 National Basketball Association (NBA) season. This period coincides with the phenomenal and largely unanticipated performance of the New York Knicks’ starting point guard Jeremy Lin and the attendant public and media hype known as Linsanity. We employ a data envelopment analysis (DEA) approach that includes allowance for an undesirable output, here turnovers per game, alongside the desirable outputs of points, rebounds, assists, steals, and blocks per game and an input of minutes per game. The results indicate that, depending upon the specification, between 29 and 42 percent of NBA guards are fully efficient, including Jeremy Lin, with mean inefficiency between 3.7 and 19.2 percent. However, while Jeremy Lin is technically efficient, he seldom serves as a benchmark for inefficient players, at least when compared with established players such as Chris Paul and Dwyane Wade. This suggests the uniqueness of Jeremy Lin’s productive solution and may explain why his unique style of play, encompassing individual brilliance, unselfish play, and team leadership, holds such broad public appeal.
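Full DEA solves a linear program per decision-making unit with multiple inputs and outputs; a single-output ratio comparison conveys the underlying idea of benchmarking each player against the best observed output-to-input ratio. Player names and figures below are hypothetical, not from the note:

```python
def ratio_efficiency(output_per_input):
    """Scale each unit's output/input ratio by the best observed ratio."""
    best = max(output_per_input.values())
    return {name: ratio / best for name, ratio in output_per_input.items()}

# Hypothetical guards: points per game divided by minutes per game
players = {
    "Guard A": 14.6 / 26.9,
    "Guard B": 19.8 / 36.0,
    "Guard C": 9.5 / 28.0,
}
for name, eff in ratio_efficiency(players).items():
    print(f"{name}: {eff:.2f}")  # Guard B scores 1.00 (efficient)
```

In full DEA the efficient units additionally serve as weighted benchmarks for inefficient ones, which is the sense in which the note finds Lin efficient yet rarely a benchmark.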

Relevance: 10.00%

Abstract:

The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects that are a consequence of the combined tumour and beam motion. Prior to this work, reported studies of EDW-based interplay effects were restricted to experimental methods for assessing single-field non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments: single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10–40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumour motion affect the interplay; this becomes more prominent when the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions.
The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period were identified which show the poorest agreement between the target motion and dynamic delivery, and these are used as the 'worst case motion parameters'. The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions using the worst case parameters (i.e. 40 mm amplitude and 6 s period). Gamma analysis of the film doses at 3%/3 mm indicates that the interplay effects did not average out for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies, and it is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. Generation of the DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated; this simplifies the creation of BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple-field EDW treatments using MC methods.
The 4D CT data of an IMRT phantom with the dummy tumour is used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the 'zero-scan method' is developed for reading gel dosimeters with x-ray computed tomography (CT). It has been shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image. This zero-scan image has a similar precision to an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated, automated and used to study the interplay for multiple-field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without loss of precision or accuracy.

Relevance: 10.00%

Abstract:

Residual amplitude modulation (RAM) mechanisms in electro-optic phase modulators are detrimental in applications that require high-purity phase modulation of the incident laser beam. While the origins of RAM are not fully understood, measurements have revealed that it depends on the beam properties of the laser as well as the properties of the medium. Here we present experimental and theoretical results that demonstrate, for the first time, the dependence of RAM production in electro-optic phase modulators on beam intensity. The results show an order of magnitude increase in the level of RAM, around 10 dB, with a fifteenfold increase in the input intensity from 12 to 190 mW/mm². We show that this intensity-dependent RAM is photorefractive in origin. © 2012 Optical Society of America.
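The reported figures can be sanity-checked with the standard decibel relation 10·log₁₀(ratio): an order-of-magnitude RAM increase is 10 dB, while the 12 → 190 mW/mm² intensity step is a ratio of roughly 15.8:

```python
import math

def to_db(power_ratio):
    """Convert a power ratio to decibels."""
    return 10 * math.log10(power_ratio)

print(round(to_db(10), 1))        # 10.0  (order-of-magnitude RAM increase)
print(round(to_db(190 / 12), 1))  # 12.0  (the fifteenfold intensity step, in dB)
```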

Relevance: 10.00%

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture defined with a parametric and generative evolutionary design system to support an integrated interdisciplinary building design approach. The research recognises the need to shift design efforts toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication on the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements, across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and the ability to generate and assess a wider range of potential design solutions than a human could manage. This wider ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research work proposes a design method and system that promotes a collaborative relationship between human creativity and the computer capability. The tectonic design approach is adopted as a process oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This will produce design solutions that respond to their environmental and function requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design. 
Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that have not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate the system performance based on the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components. The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the human creativity of designers in a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms.
The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided as a higher level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels, 'Room', 'Layout', 'Building' and 'Optimisation'. At each level multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem to allow for the design requirements of each level to be dealt with separately and then reassembling them in a bottom up approach reduces the generation of non-viable solutions through constraining the options available at the next higher level. The iterative approach, in exploring the range of design solutions through modification of the design schema as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and the computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity. 
By focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
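The evolutionary machinery underlying such a system can be sketched as a minimal generic genetic-algorithm loop; this is a sketch of the general technique, not the HEAD system's synthesis algorithms, and the bit-string encoding and toy fitness are invented for illustration:

```python
import random

def evolve(fitness, length=12, pop_size=30, generations=60, seed=1):
    """Evolve a bit-string toward `fitness` via selection, crossover, mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:              # occasional single-bit mutation
                i = rng.randrange(length)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: reward adjacent genes that differ (an 'alternation' pattern)
alternation = lambda g: sum(g[i] != g[i + 1] for i in range(len(g) - 1))
best = evolve(alternation)
print(alternation(best))  # close to the maximum of 11
```

In the HEAD architecture, the role played here by the single toy fitness would instead be filled by multiple prioritised fitness functions at each of the Room, Layout, Building and Optimisation levels.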

Relevance: 10.00%

Abstract:

Soluble organic matter derived from exotic Pinus species has been shown to form stronger complexes with iron (Fe) than that derived from most native Australian species. It has also been proposed that the establishment of exotic Pinus plantations in coastal southeast Queensland may have enhanced the solubility of Fe in soils by increasing the amount of organically complexed Fe, but this remains inconclusive. In this study we test whether the concentration and speciation of Fe in soil water from Pinus plantations differs significantly from soil water from native vegetation areas. Both Fe redox speciation and the interaction between Fe and dissolved organic matter (DOM) were considered; the Fe–DOM interaction was assessed using the Stockholm Humic Model. Iron concentrations (mainly Fe²⁺) were greatest in the soil waters with the greatest DOM content collected from sandy podosols (Podzols), where they are largely controlled by redox potential. Iron concentrations were small in soil waters from clay and iron oxide-rich soils, in spite of similar redox potentials. This condition is related to stronger sorption on to the reactive clay and iron oxide mineral surfaces in these soils, which reduces the amount of DOM available for electron shuttling and microbial metabolism, restricting reductive dissolution of Fe. Vegetation type had no significant influence on the concentration and speciation of iron in soil waters, although DOM from Pinus sites had greater acidic functional group site densities than DOM from native vegetation sites. This is because Fe is mainly in the ferrous form, even in samples from the relatively well-drained podosols. However, modelling suggests that Pinus DOM can significantly increase the amount of truly dissolved ferric iron remaining in solution in oxic conditions.
Therefore, the input of ferrous iron together with Pinus DOM to surface waters may reduce precipitation of hydrous ferric oxides (ferrihydrite) and increase the flux of dissolved Fe out of the catchment. Such inputs of iron are most probably derived from podosols planted with Pinus.

Relevance: 10.00%

Abstract:

The volcanic succession on Montserrat provides an opportunity to examine the magmatic evolution of island arc volcanism over a ∼2.5 Ma period, extending from the andesites of the Silver Hills center to the currently active Soufrière Hills volcano (February 2010). Here we present high-precision double-spike Pb isotope data, combined with trace element and Sr-Nd isotope data, throughout this period of Montserrat's volcanic evolution. We demonstrate that each volcanic center (South Soufrière Hills, Soufrière Hills, Centre Hills and Silver Hills) can be clearly discriminated using trace element and isotopic parameters. Variations in these parameters suggest there have been systematic and episodic changes in the subduction input. The South Soufrière Hills (SSH) center, in particular, has a greater slab fluid signature, as indicated by low Ce/Pb, but less sediment addition than the other volcanic centers, which have higher Th/Ce. Pb isotope data from Montserrat fall along two trends: the Silver Hills, Centre Hills and Soufrière Hills lie on a general trend of the Lesser Antilles volcanics, whereas the SSH volcanics define a separate trend. The Soufrière Hills and SSH volcanic centers were erupted at approximately the same time, but retain distinctive isotopic signatures, suggesting that the SSH magmas have a different source to the other volcanic centers. We hypothesize that this rapid magmatic source change is controlled by the regional transtensional regime, which allowed the SSH magma to be extracted from a shallower source. The Pb isotopes indicate an interplay between subduction-derived components and a MORB-like mantle wedge influenced by a Galapagos plume-like source.

Relevance: 10.00%

Abstract:

This paper reports on a small-scale study which looked into the impact of metacognitive instruction on listeners’ comprehension. Twenty-eight adult, Iranian, high-intermediate-level EFL listeners participated in a “strategy-based” approach of advance organisation, directed attention, selective attention, and self-management in each of four listening lessons focused on improving listeners’ comprehension of IELTS listening texts. A comparison of pretest and posttest scores showed that the “less-skilled” listeners improved more than the “more-skilled” listeners on the IELTS listening tests. Findings also supported the view that metacognitive instruction assisted listeners in attending to the process of listening input and in promoting listening comprehension ability.