979 results for Distributed Production
Abstract:
This paper describes a neural model of speech acquisition and production that accounts for a wide range of acoustic, kinematic, and neuroimaging data concerning the control of speech movements. The model is a neural network whose components correspond to regions of the cerebral cortex and cerebellum, including premotor, motor, auditory, and somatosensory cortical areas. Computer simulations of the model verify its ability to account for compensation for lip and jaw perturbations during speech. Specific anatomical locations of the model's components are estimated, and these estimates are used to simulate fMRI experiments of simple syllable production with and without jaw perturbations.
Abstract:
This article describes the VITEWRITE model for generating handwriting movements. The model consists of a sequential controller, or motor program, that interacts with a trajectory generator to move a hand with redundant degrees of freedom. The neural trajectory generator is the Vector Integration to Endpoint (VITE) model for synchronous variable-speed control of multijoint movements. VITE properties enable a simple control strategy to generate complex handwritten script if the hand model contains redundant degrees of freedom. The controller launches transient directional commands to independent hand synergies at times when the hand begins to move, or when a velocity peak in the outflow command to a given synergy occurs. The VITE model translates these temporally disjoint synergy commands into smooth curvilinear trajectories among temporally overlapping synergetic movements. Each synergy exhibits a unimodal velocity profile during any stroke, generates letters that are invariant under speed and size rescaling, and enables effortless connection of letter shapes into words. Speed and size rescaling are achieved by scalar GO and GRO signals that express computationally simple volitional commands. Psychophysical data such as the isochrony principle, asymmetric velocity profiles, and the two-thirds power law relating movement curvature and velocity arise as emergent properties of model interactions.
Abstract:
This paper describes a model of speech production called DIVA that highlights issues of self-organization and motor equivalent production of phonological units. The model uses a circular reaction strategy to learn two mappings between three levels of representation. Data on the plasticity of phonemic perceptual boundaries motivates a learned mapping between phoneme representations and vocal tract variables. A second mapping between vocal tract variables and articulator movements is also learned. To achieve the flexible control made possible by the redundancy of this mapping, desired directions in vocal tract configuration space are mapped into articulator velocity commands. Because each vocal tract direction cell learns to activate several articulator velocities during babbling, the model provides a natural account of the formation of coordinative structures. Model simulations show automatic compensation for unexpected constraints despite no previous experience or learning under these constraints.
Abstract:
This article describes a neural network model that addresses the acquisition of speaking skills by infants and subsequent motor equivalent production of speech sounds. The model learns two mappings during a babbling phase. A phonetic-to-orosensory mapping specifies a vocal tract target for each speech sound; these targets take the form of convex regions in orosensory coordinates defining the shape of the vocal tract. The babbling process wherein these convex region targets are formed explains how an infant can learn phoneme-specific and language-specific limits on acceptable variability of articulator movements. The model also learns an orosensory-to-articulatory mapping wherein cells coding desired movement directions in orosensory space learn articulator movements that achieve these orosensory movement directions. The resulting mapping provides a natural explanation for the formation of coordinative structures. This mapping also makes efficient use of redundancy in the articulator system, thereby providing the model with motor equivalent capabilities. Simulations verify the model's ability to compensate for constraints or perturbations applied to the articulators automatically and without new learning and to explain contextual variability seen in human speech production.
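The redundancy exploitation described above can be illustrated with a toy directional mapping. In the sketch below, a fixed Jacobian and its Moore-Penrose pseudoinverse stand in for the learned orosensory-to-articulatory mapping; the matrix values, dimensions, and function name are assumptions for the example only, not part of the published model.

```python
import numpy as np

# Toy forward model: 2-D orosensory coordinates as a function of three
# articulator positions (a redundant system, as in the abstract above).
J = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])   # assumed constant Jacobian for this sketch

def articulator_velocity(jac, desired_direction):
    """Map a desired orosensory direction to articulator velocities using
    the Moore-Penrose pseudoinverse (the minimum-norm solution)."""
    return np.linalg.pinv(jac) @ desired_direction

d = np.array([1.0, 0.0])           # desired movement in orosensory space
v = articulator_velocity(J, d)     # all three articulators contribute

# Simulate constraining articulator 1: zero its column and recompute; the
# remaining articulators compensate with no new learning required.
J_blocked = J.copy()
J_blocked[:, 1] = 0.0
v_blocked = articulator_velocity(J_blocked, d)
```

In both the free and constrained cases the commanded velocities reproduce the desired orosensory direction, which is the motor-equivalence property the abstract describes.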
Abstract:
Advanced Research Projects Agency (ONR N00014-92-J-4015); Office of Naval Research (N00014-91-J-4100, N00014-92-J-1309)
Abstract:
This article describes a neural network model, called the VITEWRITE model, for generating handwriting movements. The model consists of a sequential controller, or motor program, that interacts with a trajectory generator to move a hand with redundant degrees of freedom. The neural trajectory generator is the Vector Integration to Endpoint (VITE) model for synchronous variable-speed control of multijoint movements. VITE properties enable a simple control strategy to generate complex handwritten script if the hand model contains redundant degrees of freedom. The proposed controller launches transient directional commands to independent hand synergies at times when the hand begins to move, or when a velocity peak in a given synergy is achieved. The VITE model translates these temporally disjoint synergy commands into smooth curvilinear trajectories among temporally overlapping synergetic movements. The separate "score" of onset times used in most prior models is hereby replaced by a self-scaling activity-released "motor program" that uses few memory resources, enables each synergy to exhibit a unimodal velocity profile during any stroke, generates letters that are invariant under speed and size rescaling, and enables effortless connection of letter shapes into words. Speed and size rescaling are achieved by scalar GO and GRO signals that express computationally simple volitional commands. Psychophysical data concerning hand movements, such as the isochrony principle, asymmetric velocity profiles, and the two-thirds power law relating movement curvature and velocity arise as emergent properties of model interactions.
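The GO-gated speed rescaling described above can be sketched numerically. The toy single-synergy integration below is only loosely VITE-style: the parameter values, the ramping form of the GO signal, and the function name are assumptions for illustration, not the published model. It shows that scaling the GO signal changes movement speed while leaving the endpoint unchanged.

```python
def vite_trajectory(target, go_scale, steps=2000, dt=0.001, alpha=30.0):
    """Integrate a one-synergy VITE-style loop: the difference vector v
    tracks (target - position), and position integrates the GO-gated
    outflow command g * v. Returns final position and peak speed."""
    p, v = 0.0, 0.0
    speeds = []
    for k in range(steps):
        g = go_scale * (k * dt)                  # ramping GO signal (assumed form)
        v += dt * alpha * (-v + (target - p))    # difference-vector dynamics
        dp = g * max(v, 0.0)                     # gated outflow velocity
        p += dt * dp
        speeds.append(dp)
    return p, max(speeds)

# Doubling GO rescales speed (higher velocity peak) but both movements
# reach the same endpoint, i.e. the "letter shape" is preserved:
p1, peak1 = vite_trajectory(target=1.0, go_scale=5.0)
p2, peak2 = vite_trajectory(target=1.0, go_scale=10.0)
```

The velocity profile produced by each run is unimodal: speed rises as the GO signal grows, then falls as the difference vector shrinks near the target.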
Abstract:
Petrochemical plastics/polymers are a common feature of day-to-day living as they occur in packaging, furniture, mobile phones, computers, construction equipment etc. However, these materials are produced from non-renewable materials and are resistant to microbial degradation in the environment. Considerable research has therefore been carried out into the production of sustainable, biodegradable polymers, amenable to microbial catabolism to CO2 and H2O. A key group of microbial polyesters, widely considered as optimal replacement polymers, are the Polyhydroxyalkanoates (PHAs). Primary research in this area has focused on using recombinant pure cultures to optimise PHA yields; however, despite considerable success, the high costs of pure culture fermentation have so far hindered the commercial viability of PHAs produced in this way. In more recent years work has begun to focus on mixed cultures for the optimisation of PHA production, with the incorporation of waste substrates offering the greatest reductions in production costs. The scale of dairy processing in Ireland, and the high organic load wastewaters generated, represent an excellent potential substrate for bioconversion to PHAs in a mixed culture system. The current study sought to investigate the potential for such bioconversion in a laboratory-scale biological system and to establish key operational and microbial characteristics of same. Two sequencing batch reactors were set up and operated along the lines of an enhanced biological phosphate removal (EBPR) system, which has PHA accumulation as a key step within repeated rounds of anaerobic/aerobic cycling. Influents to the reactors varied only in the carbon sources provided. Reactor 1 received artificial wastewater with acetate alone, which is known to be readily converted to PHA in the anaerobic step of EBPR. Reactor 2 wastewater influent contained acetate and skim milk to imitate a dairy processing effluent.
Chemical monitoring of nutrient remediation within the reactors was continuously applied and EBPR-consistent performance was observed. Qualitative analysis of the sludge was carried out using fluorescence microscopy with the Nile Blue A lipophilic stain, and PHA production was confirmed in both reactors. Quantitative analysis via HPLC detection of crotonic acid derivatives revealed the fluorescence to be short chain length polyhydroxybutyrate, with biomass dry weight accumulations of 11% and 13% being observed in reactors 1 and 2, respectively. Gas chromatography-mass spectrometry for medium chain length methyl ester derivatives revealed the presence of hydroxyoctanoic, -decanoic and -dodecanoic acids in reactor 1. Similar analyses in reactor 2 revealed monomers of 3-hydroxydodecenoic and 3-hydroxytetradecanoic acids. Investigation of the microbial ecology of both reactors was conducted in an attempt to identify key species potentially contributing to reactor performance. Culture-dependent investigations indicated that quite different communities were present in the two reactors. Reactor 1 isolates demonstrated the following species distribution: Pseudomonas (82%), Delftia acidovorans (3%), Acinetobacter sp. (5%), Aminobacter sp. (3%), Bacillus sp. (3%), Thauera sp. (3%) and Cytophaga sp. (3%). Relative species distributions among reactor 2 profiled isolates were more evenly distributed between Pseudoxanthomonas (32%), Thauera sp. (24%), Acinetobacter (24%), Citrobacter sp. (8%), Lactococcus lactis (5%), Lysinibacillus (5%) and Elizabethkingia (2%). In both reactors Gammaproteobacteria dominated the cultured isolates. Culture-independent 16S rRNA gene analyses revealed differing profiles for the two reactors. Reactor 1 clone distribution was as follows: Zoogloea resiniphila (83%), Zoogloea oryzae (2%), Pedobacter composti (5%), Neisseriaceae sp. (2%), Rhodobacter sp. (2%), Runella defluvii (3%) and Streptococcus sp. (3%).
RFLP-based species distribution among the reactor 2 clones was as follows: Runella defluvii (50%), Zoogloea oryzae (20%), Flavobacterium sp. (9%), Simplicispira sp. (6%), uncultured Sphingobacteria sp. (6%), Arcicella (6%) and Leadbetterella byssophila (3%). Betaproteobacteria dominated the 16S rRNA gene clones identified in both reactors. FISH analysis with Nile Blue dual staining resolved these divergent findings, identifying the Betaproteobacteria as the dominant PHA accumulators within the reactor sludges, although species/strain-specific allocations could not be made. GC analysis of the sludge had indicated the presence of both medium chain length and short chain length PHAs accumulating in both reactors. In addition, the cultured isolates from the reactors had been identified previously as mcl and scl PHA producers, respectively. Characterisations of the PHA monomer profiles of the individual isolates were therefore performed to screen for potential novel scl-mcl PHAs. Nitrogen-limitation-driven PHA accumulation in E2 minimal media revealed a greater propensity among isolates for mcl-PHA production. HPLC analysis indicated that PHB production was not a major feature of the reactor isolates, and this was supported by the low presence of scl phaC1 genes among PCR-screened isolates. A high percentage distribution of phaC2 mcl-PHA synthase genes was recorded, with the majority sharing high percentage homology with class II synthases from Pseudomonas sp. The common presence of a phaC2 homologue was not reflected in the production of a common polymer: considerable variation was noted in both the monomer composition and ratios following GC analysis. While co-polymer production could not be demonstrated, potentially novel synthase substrate specificities were noted which could be exploited further in the future.
Abstract:
Consumer demand is revolutionizing the way products are being produced, distributed and marketed. In relation to the dairy sector in developing countries, aspects of milk quality are receiving more attention from both society and the government. However, milk quality management needs to be better addressed in dairy production systems to guarantee the access of stakeholders, mainly smallholders, into dairy markets. The present study is focused on an analysis of the interaction of the upstream part of the dairy supply chain (farmers and dairies) in the Mantaro Valley (Peruvian central Andes), in order to understand the constraints both stakeholders face in implementing milk quality controls and practices, and to evaluate “ex-ante” how different strategies suggested to improve milk quality could affect farmers' and processors' profits. The analysis is based on three complementary field studies conducted between 2012 and 2013. Our work has shown that the presence of a dual supply chain combining both formal and informal markets has a direct impact on dairy production at the technical and organizational levels, affecting small formal dairy processors' possibilities to implement contracts, including agreements on milk quality standards. The analysis of milk quality management from farms to dairy plants highlighted the poor hygiene in the study area, even when average values of milk composition were usually high. Some husbandry practices evaluated at farm level demonstrated cost effectiveness and a large impact on hygienic quality; however, regular application of these practices was limited, since small-scale farmers do not receive a bonus for producing hygienic milk. On the basis of these two results, we co-designed with formal small-scale dairy processors a simulation tool to show prospective scenarios, in which they could select their best product portfolio but also design milk payment systems to reward farmers with high milk quality performance.
This type of approach allowed dairy processors to realize the importance of including milk quality management in their collection and manufacturing processes, especially in a context of high competition for milk supply. We concluded that the improvement of milk quality in a smallholder farming context requires a more coordinated effort among stakeholders. Successful implementation of strategies will depend on the willingness of small-scale dairy processors to reward farmers producing high-quality milk, but also on support from the State to provide incentives to the stakeholders in the formal sector.
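A quality-based milk payment scheme of the kind discussed above can be sketched as a simple pricing rule. The thresholds, prices, and bonus values below are purely illustrative assumptions, not figures from the study.

```python
# Hypothetical quality-based milk payment: a base price per litre plus a
# hygiene bonus tied to total bacterial count (TBC). All thresholds and
# prices are illustrative only, not values from the study.
def milk_payment(litres, tbc_per_ml, base_price=0.30, bonus=0.03, penalty=0.03):
    """Return the payment for one delivery, in currency units."""
    if tbc_per_ml <= 100_000:      # good hygienic quality earns the bonus
        price = base_price + bonus
    elif tbc_per_ml <= 500_000:    # acceptable quality, base price only
        price = base_price
    else:                          # poor hygiene is penalised
        price = base_price - penalty
    return round(litres * price, 2)

# A farmer delivering 200 L of low-TBC milk earns the hygiene bonus,
# while the same volume of high-TBC milk is paid below the base price.
good = milk_payment(200, 80_000)
poor = milk_payment(200, 600_000)
```

A rule like this makes the incentive gap explicit: the payoff difference between hygienic and unhygienic deliveries can be compared directly against the cost of the husbandry practices evaluated at farm level.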
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
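The production/interpretation/consumption workflow with a maintainable provenance record can be sketched minimally as follows. The class, method names, and log structure are hypothetical illustrations, not the platform's actual API.

```python
import hashlib
import json

class ProvenancePipeline:
    """Toy workflow that hashes each stage's output into an append-only
    provenance log, so a third party can re-run an analysis and verify
    that the recorded digests match."""

    def __init__(self):
        self.log = []   # append-only provenance record

    def _record(self, stage, payload):
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        self.log.append({"stage": stage, "sha256": digest})

    def produce(self, raw):
        """Register raw data entering the workflow."""
        self._record("production", raw)
        return raw

    def interpret(self, raw, technique, fn):
        """Apply one named analysis technique and record its result,
        keeping the raw data available for other techniques."""
        result = fn(raw)
        self._record("interpretation:" + technique, result)
        return result

pipeline = ProvenancePipeline()
raw = pipeline.produce([72, 75, 190, 74])   # e.g. raw physiological samples
peaks = pipeline.interpret(raw, "threshold",
                           lambda xs: [x for x in xs if x > 100])
```

Because the raw data is recorded separately from each interpretation, a second technique can be applied to the same input without inheriting any bias from the first, which is the point the abstract makes.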
Abstract:
The television and film industries are used to working on large projects. These projects use media and documents of various types, ranging from actual film and videotape to items such as PERT charts for project planning. Some items, such as scripts, evolve over a period and go through many versions. It is often necessary to attach information to these “objects” in order to manage, track, and retrieve them. On large productions there may be hundreds of personnel who need access to this material and who in their turn generate new items which form some part of the final production. The requirements for this industry in terms of an information system may be generalized and a distributed software architecture built, primarily using the internet, to serve the needs of these projects. This architecture must enable potentially very large collections of objects to be managed in a secure environment with distributed responsibilities held by many working on the production. Copyright © 2005 by the Society of Motion Picture and Television Engineers, Inc.
Abstract:
PEGS (Production and Environmental Generic Scheduler) is a generic production scheduler that produces good schedules over a wide range of problems. It is centralised, using search strategies with the Shifting Bottleneck algorithm. We have also developed an alternative distributed approach using software agents. In some cases this reduces run times by a factor of 10 or more. In most cases, the agent-based program also produces good solutions for published benchmark data, and the short run times make our program useful for a large range of problems. Test results show that the agents can produce schedules comparable to the best found so far for some benchmark datasets and actually better schedules than PEGS on our own random datasets. The flexibility that agents can provide for today's dynamic scheduling is also appealing. We suggest that in this sort of generic or commercial system, the agent-based approach is a good alternative.
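One way to picture the agent-based alternative is a bidding dispatch loop in which each machine agent bids its earliest finish time for the next job. This is a generic sketch under assumed rules, not the actual PEGS or agent-based program described in the abstract.

```python
# Toy distributed dispatching: each machine "agent" bids its earliest
# finish time for the next job, and the lowest bidder wins the job.
def dispatch(jobs, n_machines):
    """jobs: list of processing times, released in order.
    Returns per-machine completion times and the job-to-machine assignment."""
    finish = [0] * n_machines          # each agent's current finish time
    assignment = []
    for duration in jobs:
        # Every agent bids (its finish time if it took this job, its id);
        # tuple comparison picks the lowest bid, ties going to the lowest id.
        bids = [(finish[m] + duration, m) for m in range(n_machines)]
        best_finish, winner = min(bids)
        finish[winner] = best_finish
        assignment.append(winner)
    return finish, assignment

finish, assignment = dispatch([4, 2, 7, 1, 3], n_machines=2)
makespan = max(finish)
```

Unlike a centralised search such as the Shifting Bottleneck procedure, each decision here uses only local agent state, which is what makes this style of scheduling fast and naturally suited to dynamic problems, at the possible cost of schedule quality.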
Abstract:
This paper presents the results of an experimental study of passive intermodulation (PIM) generation in microstrip lines with U-shaped and meandered strips, impedance tapers, and strips with profiled edges. It is shown that geometrical discontinuities in printed circuits may have a noticeable impact on distributed PIM generation even when their effect is indiscernible in linear regime measurements. A consistent interpretation of the observed phenomena has been proposed on the basis of phase synchronism in the four-wave mixing process. The results of this study reveal new features of PIM production important for the design and characterization of low-PIM microstrip circuits. © 2010 IEEE.
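For reference, the third-order products involved in the four-wave mixing process above fall at the standard intermodulation frequencies 2f1 - f2 and 2f2 - f1 of the two carriers. A small helper (names assumed) computes them:

```python
# Third-order passive intermodulation (PIM3) products of two carriers f1, f2
# fall at 2*f1 - f2 and 2*f2 - f1; these are the four-wave mixing terms.
def pim3(f1_mhz, f2_mhz):
    """Return the (lower, upper) third-order PIM frequencies in MHz."""
    return (2 * f1_mhz - f2_mhz, 2 * f2_mhz - f1_mhz)

low, high = pim3(935.0, 960.0)   # e.g. two downlink carriers 25 MHz apart
# low = 910.0 MHz, high = 985.0 MHz
```

The lower product typically lands in or near the receive band, which is why even the weak distributed PIM measured in the paper matters for low-PIM circuit design.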
Abstract:
Researchers who want to analyse Health Care data may require large pools of compute and data resources, and to obtain these they need access to Distributed Computing Infrastructures (DCIs). Using DCIs requires expertise that researchers may not have. Workflows can hide infrastructures, but although there are many workflow systems, they are not interoperable, and learning a workflow system and creating workflows in it may require significant effort. Given this effort, it is not reasonable to expect that researchers will learn new workflow systems in order to run workflows from other workflow systems. As a result, the lack of interoperability prevents workflow sharing, and a vast amount of research effort is wasted. The FP7 Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs (SHIWA) project developed Coarse-Grained Interoperability (CGI) to enable workflow sharing. The project created the SHIWA Simulation Platform (SSP) to support CGI as a production-level service. The paper describes how the CGI approach can be used for analysis and simulation in Health Care.
Abstract:
Affiliation: Florina Moldovan: Faculté de médecine dentaire, Université de Montréal & CHU Hôpital Sainte-Justine, Université de Montréal. Christina Alexandra Manacu, Marjolaine Roy-Beaudry, Fazool Shipkolye: CHU Hôpital Sainte-Justine, Université de Montréal. Johanne Martel-Pelletier & Jean-Pierre Pelletier: CHUM Hôpital Notre-Dame, Université de Montréal.
Abstract:
In light of the very high demand for natural ephedrine and pseudoephedrine, a search for an angiosperm plant containing the alkaloid ephedrine was made, and Sida spp. of the Malvaceae family were identified. Sida is a large genus of herbs and shrubs distributed throughout the tropics. About a dozen species occur in India. The medicinally important species known are S. rhombifolia, S. cordata and S. spinosa (Anon, 1972). Among the various species, S. rhombifolia is the most widely used in the traditional system of medicine. An attempt was made in the present study to develop an ideal bioprocess for the in vitro production of ephedrine from the cell culture system of Sida rhombifolia Linn. ssp. retusa. Callus and suspension cultures were initiated and attempts were made to enhance the yield by employing various strategies such as mutagenesis, immobilization and the addition of precursors, elicitors and permeabilizing agents.