904 results for Single-process Models


Relevance:

80.00%

Publisher:

Abstract:

In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision can be realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and 5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation and on control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large-volume measurement process models within an integrated dimensional variation management (IDVM) system.

Relevance:

80.00%

Publisher:

Abstract:

Nowadays, due to regulation and internal motivations, financial institutions pay closer attention to their risks. Besides the previously dominant market and credit risks, the new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. We first present the basic features of operational risk and its modelling and regulatory approaches, and then analyse operational risk within a simulation model framework of our own development. Our approach is based on the analysis of a latent risk process instead of the manifest risk process that is widely popular in the risk literature. In our model the latent risk process is a stochastic process, the so-called Ornstein-Uhlenbeck process, which is a mean-reverting process. In the model framework we define a catastrophe as a breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity and first time to hit, not only for a single process but for a dual process as well. Based on our first results, we could not falsify the Poisson feature of the frequency or the long-tail feature of the severity. The distribution of first time to hit requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting, and we conclude with possible directions for further research.

Relevance:

80.00%

Publisher:

Abstract:

The auditory evoked N1m-P2m response complex presents a challenging case for MEG source modelling, because symmetrical, phase-locked activity occurs in the hemispheres both contralateral and ipsilateral to stimulation. Beamformer methods, in particular, can be susceptible to localisation bias and spurious sources under these conditions. This study explored the accuracy and efficiency of event-related beamformer source models for auditory MEG data under typical experimental conditions: monaural and diotic stimulation; and whole-head beamformer analysis compared to a half-head analysis using only sensors from the hemisphere contralateral to stimulation. Event-related beamformer localisations were also compared with more traditional single-dipole models. At the group level, the event-related beamformer performed as well as the single-dipole models in terms of accuracy for both the N1m and the P2m, and in terms of efficiency (number of successful source models) for the N1m. The results yielded by the half-head analysis did not differ significantly from those produced by the traditional whole-head analysis. Any localisation bias caused by the presence of correlated sources is minimal in the context of the inter-individual variability in source localisations. In conclusion, event-related beamformers provide a useful alternative to equivalent-current dipole models in the localisation of auditory evoked responses.

Relevance:

80.00%

Publisher:

Abstract:

A class of multi-process models is developed for collections of time indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.

Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.

The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron-level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron's firing pattern fluctuates between features associated with each sound in isolation.

The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age-specific latent natural ability class and a performance-enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of a player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.

All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.

Relevance:

80.00%

Publisher:

Abstract:

Wheat (Triticum aestivum L.) has a long tradition as a raw material for the production of malt and beer. While breeding and cultivation efforts for barley have been highly successful in creating agronomically and brewing-technically optimal specialty cultivars that have become well established as brewing barley varieties, the picture is completely different for brewing wheat. Increasing demand for wheat beer results in a rising demand for raw material. Wheat has been, and still is, grown almost exclusively for the baking industry. It is this high demand that defines most wheat breeding objectives, and these objectives are generally not favourable for the brewing industry. It is therefore of major interest to screen wheat varieties for brewing processability and to give more focus to wheat as a brewing cereal. To obtain fast and reliable predictions about the suitability of wheat cultivars, a new mathematical method was developed in this work. The method allows a selection based on generally accepted quality characteristics. The parameters raw protein, soluble nitrogen, Kolbach index, extract and viscosity were chosen as selection criteria. During a three-year cultivation series, wheat varieties were evaluated for their suitability for brewing as well as their stability under environmental conditions. To gain a fundamental understanding of the complex malting process, microstructural changes were evaluated and visualized by confocal laser scanning and scanning electron microscopy. Furthermore, the changes observed in the micrographs were verified and supported by metabolic changes using established malt attributes. The degradation and formation of proteins during malting is essential for the final beer quality. To visualise the fundamental protein changes taking place during malting, samples from each single process step were analysed and fractionated according to their solubility. Protein fractions were analysed using a lab-on-a-chip technique as well as OFFGEL analysis. In general, a protein distribution in wheat different from that in barley or oat could be confirmed. During the malting process, a degradation of proteins to small peptides and amino acids could be observed in all four Osborne fractions. Furthermore, in this study a protein profiling was performed to evaluate changes during the mashing process as well as the influence of grist composition. Differences in specific protein peaks and profiles were detected for all samples during mashing. This study investigated the suitability of wheat for the malting and brewing industry and closed the scientific gap concerning amylolytic, cytolytic and proteolytic changes during malting and mashing.

Relevance:

80.00%

Publisher:

Abstract:

143Nd/144Nd ratios have been determined for 37 samples of oceanic basalt, with a typical precision of ±2–3 × 10**-5 (2 sigma). Ocean island and dredged and cored submarine basalts are included for which reliable measurements of 87Sr/86Sr ratios exist in the literature or have been measured as part of this study. A strong negative correlation exists between 143Nd/144Nd and 87Sr/86Sr ratios in basalts from Iceland and the Reykjanes Ridge, but such a clear correlation does not exist for samples from the Hawaiian Islands. However, when other ocean island basalts from the Atlantic are included, there is an overall correlation between these two parameters. Increases and decreases in Rb/Sr in oceanic basalt source regions have in general been accompanied by decreases and increases, respectively, in Sm/Nd ratios. The compatibility of the data with single-stage models is assessed, and it is concluded that enrichment and depletion events, consistent with the transfer of silicate melts, are responsible for the observed variation.

Relevance:

80.00%

Publisher:

Abstract:

The direct detection of a stellar system that explodes as a Type Ia supernova (SN Ia) has not yet been successful. Various indirect methods have been used to investigate SN Ia progenitor systems, but none have produced conclusive results. A prediction of single-degenerate models is that H- (or He-) rich material from the envelope of the companion star should be swept up by the SN ejecta in the explosion. Seven SNe Ia have been analysed to date looking for signs of H-rich material in their late-time spectra, and none were detected. We present results from new late-time spectra of 11 SNe Ia obtained at the Very Large Telescope using XShooter and FORS2. We present the tentative detection of Hα emission for SN 2013ct, corresponding to ∼0.007 M⊙ of stripped/ablated companion star material (under the assumptions of the spectral modelling). This mass is significantly lower than expected for single-degenerate scenarios, suggesting that >0.1 M⊙ of H-rich material is present but not observed. We do not detect Hα emission in the other 10 SNe Ia. This brings the total sample of normal SNe Ia with non-detections (<0.001–0.058 M⊙) of H-rich material to 17 events. The simplest explanation for these non-detections is that these objects did not result from the explosion of a CO white dwarf accreting matter from a H-rich companion star via Roche lobe overflow or symbiotic channels. However, further spectral modelling is needed to confirm this. We also find no evidence of He emission features, but models with He-rich companion stars are not available to place mass limits.

Relevance:

80.00%

Publisher:

Abstract:

Aims/Purpose: Protocols are evidence-based structured guides for directing care to achieve improvements. But translating that evidence into practice is a major challenge. It is not acceptable simply to introduce a protocol and expect it to be adopted and to change practice. Implementation requires effective leadership and management. This presentation describes an implementation strategy that should promote successful adoption and lead to practice change.
Presentation description: There are many social and behavioural change models to assist and guide practice change. Choosing a model to guide implementation is important for providing a framework for action. The change process requires careful thought, from the protocol itself to the policies and politics within the ICU. In this presentation, I discuss a useful pragmatic guide called the 6SQUID (6 Steps in QUality Intervention Development). This was initially designed for public health interventions, but the model has wider applicability and has similarities with other change process models. Steps requiring consideration include examining the purpose and the need for change; the staff that will be affected and the impact on their workload; and the evidence base supporting the protocol. Subsequent steps in the process that the ICU manager should consider are the change mechanism (widespread multi-disciplinary consultation; adapting the protocol to the local ICU); and identifying how to deliver the change mechanism (educational workshops and preparing staff for the changes are imperative). Recognising the barriers to implementation and change and addressing these locally is also important. Once the protocol has been implemented, there is generally a learning curve before it becomes embedded in practice. Audit and feedback on adherence are useful strategies to monitor and sustain the changes.
Conclusion: Managing change successfully will promote a positive experience for staff. In turn, this will encourage a culture of enthusiasm for translating evidence into practice.

Relevance:

80.00%

Publisher:

Abstract:

In this work, optical filter arrays for high-quality spectroscopic applications in the visible (VIS) wavelength range are investigated. The optical filters, consisting of Fabry-Pérot (FP) filters for high-resolution miniaturized optical nanospectrometers, are based on two highly reflective dielectric mirrors and an intermediate resonance cavity made of polymer. Each filter allows a narrow spectral band (called a filter line in this work) to pass, depending on the height of its resonance cavity. The efficiency of such an optical filter depends on the precise fabrication of highly selective multispectral filter arrays of FP filters by low-cost, high-throughput methods. The fabrication of the multiple spectral filters over the entire visible range is achieved in a single imprint step on one substrate using 3D nanoimprint technology with very high vertical resolution. The key to this process integration is the fabrication of 3D nanoimprint stamps with the desired arrays of filter cavities. The spectral sensitivity of these efficient optical filters depends on the accuracy of the vertically varying cavities, which are produced by a large-area "soft" nanoimprint technology, UV substrate-conformal imprint lithography (UV-SCIL). The main problems of UV-based SCIL processes, such as a non-uniform residual layer thickness and shrinkage of the polymer, limit the potential application of this technology. It is very important that the residual layer is thin and uniform, so that the critical dimensions of the functional 3D pattern can be maintained during the plasma etching that removes the residual layer. In the case of the nanospectrometer, the cavities vary vertically between neighbouring FP filters, so the volume of each individual filter changes, which leads to a variation of the residual layer thickness under each filter. The volumetric shrinkage caused by the polymerization process affects the size and dimensions of the imprinted polymer cavities. The performance of the large-area UV-SCIL process is improved by using a volume-equalized design, and the process conditions are optimized. The volume-equalized stamp design distributes 64 vertically varying filter cavities into units of 4 cavities that share a common average volume. By using the volume-equalized design, uniform residual layer thicknesses (110 nm) are obtained across all filter heights. The polymer shrinkage is analysed quantitatively in the lateral and vertical directions of the FP filters. Shrinkage in the vertical direction has the largest influence on the spectral response of the filters and is reduced from 12% to 4% by changing the exposure time. FP filters fabricated with the volume-equalized stamp and the optimized imprint process show a high-quality spectral response with a linear dependence between the cavity heights and the spectral positions of the corresponding filter lines.

Relevance:

80.00%

Publisher:

Abstract:

Changes in the economic world and the emergence of the Internet as a tool for communication and integration among markets have forced organizations to adopt a different, process-oriented structure with a focus on information management. Information technology has thus gained prominence in the organizational context, increasing the complexity and range of services provided by this function. Moreover, outsourcing has become an important model for a flexible corporate structure, helping organizations achieve better results in their activities and processes and become more competitive. IT outsourcing requires a series of steps ranging from strategic assessment to the management of the outsourced service. These steps can influence how services are contracted, varying with the types of service providers and contractors. The study therefore aimed to identify how the IT outsourcing process influences the use of models for contracting services. To this end, a multiple-case study was conducted involving two companies in Rio Grande do Norte State, specifically in the health sector. Data were collected through semi-structured interviews with the CIOs of the companies surveyed. The results show that a more structured outsourcing process favours the use of a more advanced contracting model. Some features of these steps carry this influence more clearly, such as the goals pursued by outsourcing, the criteria used in selecting the supplier, contract negotiation, how services are transitioned and the management methods used, though this can vary depending on the maturity of the relationship between the companies examined. It was also found that the contracting model adopted may in turn influence how the IT outsourcing process is developed, requiring (or not) greater formalization and organization.

Relevance:

80.00%

Publisher:

Abstract:

Designing usable software brings benefits to end users and other stakeholders. In e-commerce, usability is vital, because customers easily move on to the next site if they cannot find what they are looking for. Studies show that usability affects purchase decisions. Usability also matters for customer satisfaction, which in turn affects customer loyalty. This thesis investigated how usability is designed in practice compared with theoretical recommendations. The case study examined a project to redesign the online store of an international furniture retailer. The need for the redesign arose from the poor usability of the previous version of the store. The project was carried out with the agile Scrum method. The empirical material was collected through semi-structured interviews. The interviewees were people involved in user experience design. The interview themes were derived from the theoretical material. The theoretical part examined the principles, process and methods of usability design. Twelve principles supporting and characterizing user-centred design were identified in previous research. Usability is designed through a user-centred process. The various process models regard as central the definition and understanding of the context of use, measurable usability requirements, empirical evaluation of design solutions, and the iterative nature of the design process. The study also reviewed which design methods researchers propose for usability design and which, according to survey studies, are actually used. In the online store project, usability design partly differed from the theoretical recommendations. Not all project participants had knowledge of the context of use, and usability requirements had not been set in the way the theory intends. Similarities were also found: in the project, design solutions were evaluated empirically with representatives of real users, and the design process was iterative, meaning the team was prepared to change design solutions as a result of evaluation. Based on the study, it is recommended that communication in the project be improved, because knowledge of the context of use did not reach everyone working on it. Theory should place even more emphasis on the importance of communication, and should give better guidance on producing requirement specifications in practice. Keywords: usability, user-centred design, usability principles, usability design methods, agile software development, case study

Relevance:

80.00%

Publisher:

Abstract:

Part 13: Virtual Reality and Simulation

Relevance:

80.00%

Publisher:

Abstract:

Intracochlear trauma from the surgical insertion of bulky electrode arrays and inadequate pitch perception are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but they lack rigidity and hence depend on manually fabricated, permanently attached, bulky backing devices based on polyethylene terephthalate (PET) tubing. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener that will be embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water-soluble adhesive, that will help retract the PET insertion tool after implantation. As a proof of concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene has been developed. Conventional hot embossing, mechanical micromachining, and standard cleanroom processes were integrated for patterning fully released and discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays will be possible. The developed process, when integrated with the array fabrication process, will allow fabrication of stiffener-embedded arrays in a single process. As a proof of concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x and 1.5x scale PET tube-based insertion tools to PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in vitro in phosphate buffer solution. The average detachment times, 2.6 minutes and 10 minutes for the 1x and 1.5x scale devices respectively, were found to be clinically relevant with respect to reported array insertion times during surgical implantation. Eventually, stiffener-embedded arrays would not need to be permanently attached to current insertion tools, which are left behind after implantation and congest the scala tympani chamber of the cochlea. Finally, a simulation-based approach for accelerated failure analysis of PLA stiffeners and characterization of the PVP-b-PDLLA copolymer adhesive has been explored. The residual functional life of embedded PLA stiffeners exposed to body fluid, and thereby subjected to degradation and erosion, has been estimated by simulating PLA stiffeners with different parylene coating failure types and different PLA types for a given failure type. For characterizing the PVP-b-PDLLA copolymer adhesive, several formulations were simulated and compared based on the insertion tool detachment times predicted from the dissolution, degradation, and erosion behavior of the simulated adhesive formulations. The results indicate that such simulation-based approaches could reduce the total number of time-consuming and expensive in vitro tests that must be conducted.