975 results for fixed-time AI
Abstract:
What is the time-optimal way of using a set of control Hamiltonians to obtain a desired interaction? Vidal, Hammerer, and Cirac [Phys. Rev. Lett. 88, 237902 (2002)] have obtained a set of powerful results characterizing the time-optimal simulation of a two-qubit quantum gate using a fixed interaction Hamiltonian and fast local control over the individual qubits. How practically useful are these results? We prove that there are two-qubit Hamiltonians such that time-optimal simulation requires infinitely many steps of evolution, each infinitesimally small, and thus is physically impractical. A procedure is given to determine which two-qubit Hamiltonians have this property, and we show that almost all Hamiltonians do. Finally, we determine some bounds on the penalty that must be paid in the simulation time if the number of steps is fixed at a finite number, and show that the cost in simulation time is not too great.
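For context, the results cited above are usually stated in terms of the Pauli canonical form of the two-qubit interaction; the decomposition below is recalled as a standard assumption of that framework rather than quoted from the paper:

H \simeq c_x\,\sigma_x \otimes \sigma_x + c_y\,\sigma_y \otimes \sigma_y + c_z\,\sigma_z \otimes \sigma_z, \qquad c_x \ge c_y \ge |c_z|,

where \simeq denotes equivalence up to local unitaries on the individual qubits, and in that framework the coefficients (c_x, c_y, c_z) govern the time cost of simulating one two-qubit interaction with another under fast local control.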
Abstract:
The kinetics of naphthalene-2-sulfonic acid (2-NSA) adsorption by granular activated carbon (GAC) were measured and the relationships between adsorption, desorption, bioavailability and biodegradation assessed. The conventional Langmuir model fitted the experimental sorption isotherm data, and the introduced 2-NSA-degrading bacteria, established on the surface of the GAC, did not interfere with adsorption. The potential value of GAC as a microbial support in the aerobic degradation of 2-NSA by Arthrobacter globiformis and Comamonas testosteroni was investigated. Using both virgin and microbially colonised GAC, adsorption removed 2-NSA from the liquid phase up to its saturation capacity of 140 mg/g GAC within 48 h. However, between 83.2% and 93.3% of the adsorbed 2-NSA was bioavailable to both bacterial species as a source of carbon for growth. In comparison to the non-inoculated GAC, the combination of rapid adsorption and biodegradation increased the amount of 2-NSA removed from the influent phase (by 70–93%) as well as the bed-life of the GAC (from 40 to >120 d). A microbially conditioned GAC fixed-bed reactor containing 15 g GAC removed 100% of the 2-NSA (100 mg/l) from tannery wastewater at an empty bed contact time of 22 min for a minimum of 120 d without the need for GAC reconditioning or replacement. This suggests that small-volume GAC bioreactors could be used for tannery wastewater recycling.
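For reference, the conventional Langmuir isotherm referred to above has the standard form below; the symbols are generic, and only the saturation capacity is taken from the reported value of 140 mg/g:

q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e}, \qquad q_{\max} \approx 140\ \mathrm{mg/g},

where q_e is the equilibrium 2-NSA loading on the GAC (mg/g), C_e the equilibrium liquid-phase 2-NSA concentration (mg/l) and K_L the Langmuir affinity constant (not reported in this abstract).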
Abstract:
Music plays an enormous role in today's computer games; it serves to elicit emotion, generate interest and convey important information. Traditional gaming music is fixed at the event level, where tracks loop until a state change is triggered. This behaviour, however, does not musically reflect the in-game state between these events. We propose a dynamic music environment, where music tracks adjust in real time to the emotion of the in-game state. We aim to improve the affective response to symbolic music by modifying structural and performative characteristics through the application of rule-based techniques. In this paper we take a multidisciplinary approach and present a series of primary music-emotion structural rules for implementation. The validity of these rules was tested in a small study involving eleven participants, each listening to six permutations of two musical works. Preliminary results indicate that the environment was generally successful in influencing the emotion of the musical works for three of the intended four directions (happier, sadder & content/dreamier). Our secondary aim, of establishing that music-emotion rules sourced predominantly from Western classical music could be applied with comparable results to modern computer gaming music, was also largely successful.
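Purely as an illustration of what rule-based modification of structural and performative characteristics can look like, the sketch below maps a target emotion onto tempo, mode and dynamics adjustments; the rule names and values are hypothetical and are not the rule set developed in the paper.

# Hypothetical sketch of rule-based emotion adjustment for symbolic music.
# The rules and values below are illustrative assumptions, not the paper's rule set.

EMOTION_RULES = {
    "happier":  {"tempo_scale": 1.15, "mode": "major", "velocity_offset": +10},
    "sadder":   {"tempo_scale": 0.85, "mode": "minor", "velocity_offset": -10},
    "dreamier": {"tempo_scale": 0.90, "mode": "major", "velocity_offset": -5},
}

def apply_rules(track: dict, target_emotion: str) -> dict:
    """Return a modified copy of a symbolic track for a target emotion."""
    rule = EMOTION_RULES[target_emotion]
    return {
        "tempo_bpm": track["tempo_bpm"] * rule["tempo_scale"],
        "mode": rule["mode"],
        # Clamp note velocities to the MIDI range 1-127.
        "velocities": [max(1, min(127, v + rule["velocity_offset"]))
                       for v in track["velocities"]],
    }

print(apply_rules({"tempo_bpm": 120, "velocities": [64, 72, 80]}, "sadder"))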
Abstract:
We have used MALDI-MS imaging (MALDI-MSI) to monitor the time-dependent appearance and loss of signals when tissue slices are brought rapidly to room temperature for short to medium periods of time. Sections from mouse brain were cut in a cryostat microtome, placed on a MALDI target and allowed to warm to room temperature for 30 s to 3 h. Sections were then refrozen, fixed by ethanol treatment and analysed by MALDI-MSI. The intensities of a range of markers were seen to vary across the time course, both increasing and decreasing, with the intensity of some markers changing significantly within 30 s, and markers also showed tissue-location-specific evolution. The markers resulting from this autolysis were compared directly to those that evolved in a comparable 16 h on-tissue trypsin digest, and the markers that evolved in the two studies were seen to be substantially different. These changes offer an important additional level of location-dependent information for mapping changes and seeking disease-dependent biomarkers in the tissue. They also indicate that considerable care is required to allow comparison of biomarkers between MALDI-MSI experiments, and they have implications for the standard practice of thaw-mounting multiple tissue sections onto MALDI-MS targets.
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) ‘random effects’ model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a University communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards, with samples taken from two keys per keyboard determined at 9am and 5pm. This type of design is often referred to as a ‘nested’ or ‘hierarchical’ design, and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a ‘fixed effects’ model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as ‘fixed’ or discrete effects. This Statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher’s shop, a sandwich shop, and a newsagent, and (2) the effectiveness of drugs in the treatment of a fungal eye infection.
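As a minimal sketch of the fixed-effects analysis described in scenario (1), a one-way ANOVA can be run on aerobic colony counts grouped by business type; the counts below are invented for illustration and are not the Statnote's data.

# One-way fixed-effects ANOVA for the 2p-coin scenario (hypothetical data).
from scipy import stats

butcher   = [120, 98, 143, 110, 132]   # aerobic colony counts per coin
sandwich  = [85, 74, 90, 101, 68]
newsagent = [60, 55, 71, 49, 66]

f_stat, p_value = stats.f_oneway(butcher, sandwich, newsagent)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

A small p-value indicates that mean contamination differs between at least two of the three business types; which pairs differ would then be examined with a post hoc test.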
Abstract:
IEEE 802.16 standards have been developed as one of the technical solutions for broadband wireless access systems. They offer high data rates, large network coverage, flexible QoS schemes and cheap network deployment. Various flexible mechanisms related to QoS provisioning have been specified for uplink traffic at the medium access control (MAC) layer in the standards. Among these mechanisms, the contention-based bandwidth request scheme can be used to indicate bandwidth demands to the base station for the non-real-time polling and best-effort services. These two services are used for most applications with unknown traffic characteristics. Due to the diverse QoS requirements of those applications, service differentiation (SD) is anticipated over the contention-based bandwidth request scheme. In this paper we investigate SD with the bandwidth request scheme by means of assigning different channel access parameters and bandwidth allocation priorities. The effectiveness of the differentiation schemes is evaluated by simulations. It is observed that the initial backoff window can be efficient in SD, and if combined with the bandwidth allocation priority, the SD performance is further improved.
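A toy illustration of the differentiation mechanism discussed here, assuming a highly simplified slotted contention model with truncated binary exponential backoff; this is not the paper's IEEE 802.16 simulator, and all parameter values are placeholders.

# Two service classes contend for bandwidth request slots; the class with the
# smaller initial backoff window should see a lower mean access delay.
import random

def simulate(n_high=5, n_low=5, w_high=8, w_low=32, w_max=1024, requests=2000):
    delays = {"high": [], "low": []}
    stations = (
        [{"cls": "high", "w0": w_high, "w": w_high,
          "bo": random.randrange(w_high), "wait": 0} for _ in range(n_high)] +
        [{"cls": "low", "w0": w_low, "w": w_low,
          "bo": random.randrange(w_low), "wait": 0} for _ in range(n_low)]
    )
    done = 0
    while done < requests:
        ready = [s for s in stations if s["bo"] == 0]
        if len(ready) == 1:                     # successful bandwidth request
            s = ready[0]
            delays[s["cls"]].append(s["wait"])
            done += 1
            s["w"], s["wait"] = s["w0"], 0      # reset window and delay counter
            s["bo"] = random.randrange(s["w"])
        elif len(ready) > 1:                    # collision: double the window
            for s in ready:
                s["w"] = min(2 * s["w"], w_max)
                s["bo"] = random.randrange(s["w"])
        for s in stations:                      # advance one contention slot
            if s["bo"] > 0:
                s["bo"] -= 1
            s["wait"] += 1
    return {c: sum(d) / len(d) for c, d in delays.items()}

print(simulate())   # mean slots between successful requests, per class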
Abstract:
In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Two real world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters, to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real world non-stationary data through the use of more complex adaptive models.
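To make the novelty criterion of the resource allocating network (RAN) concrete, here is a minimal sketch of its growth decision; the threshold values, the width heuristic and the predict callback are placeholders (the abstract's point is precisely that results are sensitive to these parameters).

# Sketch of the RAN novelty criterion: allocate a new RBF centre only when the
# input is far from all existing centres AND the prediction error is large.
import numpy as np

def ran_step(x, y, centres, weights, widths, predict,
             eps=0.5, e_min=0.1, default_width=0.5):
    """One growth decision of a resource-allocating RBF network."""
    y_hat = predict(x, centres, weights, widths)   # current network output
    error = y - y_hat
    dist = min((np.linalg.norm(x - c) for c in centres), default=np.inf)
    if dist > eps and abs(error) > e_min:          # novelty criterion
        centres.append(np.array(x, dtype=float))   # new centre at the input
        weights.append(error)                      # new weight = residual error
        widths.append(default_width)               # width set by a heuristic
        return True                                # model order incremented
    # Otherwise the existing parameters would be adapted (e.g. with an EKF).
    return False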
Abstract:
The management of hypertension, dyslipidaemia and hyperglycaemia often requires multiple medications that combine two or more agents with different modes of action to give additive efficacy. In some situations lower doses of two agents with different modes of action can achieve greater efficacy than a high dose of one agent. This is achieved by addressing different pathophysiological features of the disease, whilst at the same time producing fewer side effects than a high dose of one agent. Several examples of this have been described for combinations of blood glucose-lowering therapies in type 2 diabetes. However, the pill burden associated with multiple medications can reduce patient adherence and compromise the potential value of the treatments. To reduce the number of daily doses, single-tablet (‘fixed-dose’) combinations (FDCs) have been introduced to offer greater convenience. There are several anti-diabetic FDCs, mostly combining metformin with another type of glucose-lowering agent. The UK has been less enthusiastic about FDCs than many other parts of the world, and does not have most of these combinations available. One of the concerns expressed about FDCs is a reduced flexibility to select desired doses of the two agents for dose titration. However, in practice the variety of dosage strengths for most FDCs matches the dosages available as separate tablets. Another concern has been the preference to add drugs one at a time in order to attribute any adverse effects. In most cases the FDC is used when a second drug has been added to a monotherapy that is already a component of the FDC, so it is the same as adding one agent but without increasing the pill burden.
Abstract:
Various flexible mechanisms related to quality of service (QoS) provisioning have been specified for uplink traffic at the medium access control (MAC) layer in the IEEE 802.16 standards. Among these mechanisms, the contention-based bandwidth request scheme can be used to indicate bandwidth demands to the base station for the non-real-time polling and best-effort services. These two services are used for most applications with unknown traffic characteristics. Due to the diverse QoS requirements of those applications, service differentiation (SD) is anticipated over the contention-based bandwidth request scheme. In this paper we investigate SD with the bandwidth request scheme by means of assigning different channel access parameters and bandwidth allocation priorities at different packet arrival probabilities. The effectiveness of the differentiation schemes is evaluated by simulations. It is observed that the initial backoff window can be efficient in SD, and if combined with the bandwidth allocation priority, the SD performance is further improved.
Abstract:
This paper resolves the long-standing debate as to the proper time scale τ of the onset of the immunological synapse bond, the noncovalent chemical bond defining the immune pathways involving T cells and antigen-presenting cells. Results from our model calculations show τ to be of the order of seconds rather than minutes. Close to the linearly stable regime, we show that between the two critical spatial thresholds defined by the integrin:ligand pair (Δ2 ∼ 40-45 nm) and the T-cell receptor TCR:peptide-major-histocompatibility-complex (pMHC) bond (Δ1 ∼ 14-15 nm), τ grows monotonically with increasing coreceptor bond-length separation δ (= Δ2 − Δ1 ∼ 26-30 nm), while τ decays with Δ1 for fixed Δ2. The nonuniversal δ-dependent power-law structure of the probability density function further explains why only the TCR:pMHC bond is a likely candidate to form a stable synapse.
Abstract:
* This work was financially supported by RFBR-04-01-00858.
Abstract:
Time is in constant motion: the present, the future and the past, although not concepts with a fixed meaning, are present in everyday life at both the conscious and unconscious levels. The author’s intention in this paper is to grasp the relationship of companies to time and to the future in the mature and nascent states of their life cycles. As discussed in this paper, this relationship often appears, with little reflection, in the form of assumptions held by strategy researchers and practitioners. First, the interrelatedness of theory and practice is discussed in order to focus on the role of scholars and practitioners in creating theory and putting it into practice, or vice versa. This general introduction lays the ground for the study of interpretations of the future and time from the perspectives of strategy research and strategy practice, respectively.
Abstract:
For the past several decades, we have experienced tremendous growth, in both scale and scope, of real-time embedded systems, thanks largely to advances in IC technology. However, the traditional approach of boosting performance by increasing CPU frequency has become a thing of the past. Researchers from both industry and academia are turning their focus to multi-core architectures for continuous improvement of computing performance. In our research, we seek to develop efficient scheduling algorithms and analysis methods for the design of real-time embedded systems on multi-core platforms. Real-time systems are those in which the response time is as critical as the logical correctness of the computational results. In addition, a variety of stringent constraints such as power/energy consumption, peak temperature and reliability are also imposed on these systems. Therefore, real-time scheduling plays a critical role in the design of such computing systems at the system level. We started our research by addressing timing constraints for real-time applications on multi-core platforms, and developed both partitioned and semi-partitioned scheduling algorithms to schedule fixed-priority, periodic, hard real-time tasks on multi-core platforms. We then extended our research by taking temperature constraints into consideration. We developed a closed-form solution to capture temperature dynamics for a given periodic voltage schedule on multi-core platforms, and also developed three methods to check the feasibility of a periodic real-time schedule under a peak temperature constraint. We further extended our research by incorporating the power/energy constraint, with thermal awareness, into our research problem. We investigated the energy estimation problem on multi-core platforms, and developed a computationally efficient method to calculate the energy consumption for a given voltage schedule on a multi-core platform. In this dissertation, we present our research in detail and demonstrate the effectiveness and efficiency of our approaches with extensive experimental results.
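As a flavour of what partitioned fixed-priority scheduling involves, the sketch below assigns periodic tasks to cores with a first-fit-decreasing heuristic and uses the classic Liu & Layland rate-monotonic utilisation bound as the per-core schedulability test; this is a generic textbook construction, not the specific algorithms developed in the dissertation.

# First-fit-decreasing partitioning of periodic, fixed-priority, hard real-time
# tasks onto cores, with the Liu & Layland RM bound as a sufficient per-core test.

def ll_bound(n: int) -> float:
    """Liu & Layland rate-monotonic utilisation bound: n * (2^(1/n) - 1)."""
    return n * (2 ** (1.0 / n) - 1)

def first_fit_partition(tasks, n_cores):
    """tasks: list of (wcet, period). Returns per-core assignments, or None."""
    cores = [[] for _ in range(n_cores)]
    # Consider tasks in order of decreasing utilisation.
    for wcet, period in sorted(tasks, key=lambda t: -t[0] / t[1]):
        for core in cores:
            trial = core + [(wcet, period)]
            if sum(c / p for c, p in trial) <= ll_bound(len(trial)):
                core.append((wcet, period))
                break
        else:
            return None          # no core passes the schedulability test
    return cores

tasks = [(1, 4), (2, 5), (1, 10), (3, 20), (2, 8)]   # (WCET, period) pairs
print(first_fit_partition(tasks, n_cores=2))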
Abstract:
Gesture recognition is a research topic that has been gaining ever more popularity, especially in recent years, thanks to the technological progress of embedded devices and sensors. The aim of this thesis is to use machine learning techniques to build a system able to recognise and classify hand gestures in real time from the myoelectric (EMG) signals produced by the muscles. In addition, to allow the recognition of complex spatial movements, inertial signals from an Inertial Measurement Unit (IMU) equipped with an accelerometer, gyroscope and magnetometer are also processed. The first part of the thesis, besides offering an overview of wearable devices and sensors, analyses several techniques for the classification of temporal sequences, highlighting their advantages and disadvantages. In particular, approaches based on Dynamic Time Warping (DTW), Hidden Markov Models (HMM), and Long Short-Term Memory (LSTM) recurrent neural networks (RNN), which represent one of the latest developments in deep learning, are considered. The second part concerns the project itself. The Myo wearable device by Thalmic Labs is used as a case study, and the DTW- and HMM-based techniques are applied in detail to design and implement a framework able to perform real-time gesture recognition. The final chapter presents the results obtained (also providing a comparison between the analysed techniques), both for the classification of isolated gestures and for real-time recognition.
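As an illustration of the DTW-based approach, the sketch below computes a dynamic time warping distance between two feature sequences and classifies a query gesture by its nearest stored template; feature extraction and the Myo acquisition code are omitted, and all names are illustrative.

# Nearest-template gesture classification with dynamic time warping (DTW).
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW distance between two sequences of feature vectors, shape [T, d]."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])      # local distance
            cost[i, j] = d + min(cost[i - 1, j],          # insertion
                                 cost[i, j - 1],          # deletion
                                 cost[i - 1, j - 1])      # match
    return cost[n, m]

def classify(query, templates):
    """templates: list of (label, sequence). Returns the nearest template's label."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]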
Abstract:
Acknowledgements The Interdisciplinary Chronic Disease Collaboration (ICDC) is funded through the Alberta Heritage Foundation for Medical Research (AHFMR) Inter-disciplinary Team Grants Program. AHFMR is now Alberta Innovates – Health Solutions (AI-HS). The funding agreement ensured the authors’ independence in designing the study, interpreting the data, writing, and publishing the report. The Chief Scientist Office of the Scottish Government Health and Social Care Directorates funds HERU. The views expressed in this paper are those of the authors only and not those of the funding bodies.