998 results for Quantified real constraint


Relevance: 20.00%

Publisher:

Abstract:

In order to determine the size-resolved chemical composition of single particles in real time, an ATOFMS was deployed at urban background sites in Paris and Barcelona during the MEGAPOLI and SAPUSS monitoring campaigns, respectively. The particle types detected during MEGAPOLI included several carbonaceous species, metal-containing types and sea-salt. Elemental carbon (EC) particle types were highly abundant, with 86% due to fossil fuel combustion and 14% attributed to biomass burning. Furthermore, 79% of the EC was apportioned to local emissions and 21% to continental transport. The carbonaceous particle types were compared with quantitative measurements from other instruments, and while direct correlations using particle counts were poor, scaling of the ATOFMS counts greatly improved the relationship. During SAPUSS, carbonaceous species, sea-salt, dust, vegetative debris and various metal-containing particle types were identified. Throughout the campaign the site was influenced by air masses that altered the composition of the particles detected. During North African air masses the city was heavily influenced by Saharan dust. A regional stagnation was also observed, leading to a large increase in carbonaceous particle counts. While the ATOFMS provides a list of particle types present during the measurement campaigns, the data presented are not directly quantitative. The quantitative response of the ATOFMS to metals was examined by comparing the ion signals within particle mass spectra to hourly mass concentrations of Na, K, Ca, Ti, V, Cr, Mn, Fe, Zn and Pb. The ATOFMS was found to have varying correlations with these metals depending on sampling issues such as matrix effects. The strongest correlations were observed for Al, Fe, Zn, Mn and Pb. Overall, the results of this work highlight the excellent ability of the ATOFMS to provide composition and mixing state information on atmospheric particles at high time resolution. However, they also show its limitations in delivering quantitative information directly.
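The abstract reports that scaling ATOFMS particle counts greatly improved agreement with collocated quantitative instruments. As a hedged illustration only (the function name, data layout, and the choice of a least-squares factor through the origin are assumptions, not details from the thesis), a minimal sketch of fitting one scaling factor per particle class:

```python
import numpy as np

def scale_atofms_counts(hourly_counts, reference_mass):
    """Fit a single least-squares scaling factor (through the origin)
    mapping hourly ATOFMS counts for one particle class onto collocated
    reference mass concentrations, and report the post-scaling correlation."""
    counts = np.asarray(hourly_counts, dtype=float)
    mass = np.asarray(reference_mass, dtype=float)
    factor = (counts @ mass) / (counts @ counts)  # closed-form slope
    scaled = factor * counts
    r = np.corrcoef(scaled, mass)[0, 1]
    return scaled, factor, r
```

In practice the thesis notes that correlations vary with matrix effects, so a single factor per class is only a first approximation.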

Relevance: 20.00%

Publisher:

Abstract:

Process guidance supports users in increasing their process model understanding, their process execution effectiveness and efficiency, and their process compliance performance. This paper presents research in progress encompassing our ongoing DSR project on Process Guidance Systems and a field evaluation of the resulting artifact in cooperation with a company. Building on three theory-grounded design principles, a Process Guidance System artifact for the company's IT service ticketing process is developed, deployed and used. Following a multi-method approach, we plan to evaluate the artifact in a longitudinal field study. Thereby, we will gather not only self-reported but also real usage data. This article describes the development of the artifact and discusses an innovative evaluation approach.

Relevance: 20.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to grow by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked here is: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?

The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
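As a hedged illustration of the production/interpretation/consumption workflow with a maintainable provenance record (all class and function names here are hypothetical; this is a minimal sketch, not the dissertation's platform):

```python
import hashlib
import json
import time

class ProvenanceLog:
    """Append-only record of each transformation applied to the data, so an
    independent third party can verify how a conclusion was derived."""
    def __init__(self):
        self.entries = []

    def record(self, step, payload):
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True, default=str).encode()
        ).hexdigest()
        self.entries.append({"step": step, "sha256": digest,
                             "timestamp": time.time()})

def run_workflow(raw_stream, interpreters, log):
    """Apply each analysis technique to the same raw data independently,
    logging provenance, so one technique cannot bias another's input."""
    raw = list(raw_stream)                      # production
    log.record("produce", raw)
    results = {}
    for name, interpret in interpreters.items():
        results[name] = interpret(raw)          # interpretation
        log.record("interpret:" + name, results[name])
    return results                              # consumption happens downstream
```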

Relevance: 20.00%

Publisher:

Abstract:

Knowledge sharing research typically examines organizational transfer of knowledge, often from headquarters to subsidiaries, from developed-country sites to emerging-country sites, or from host to local employees. Yet recent research, such as Prahalad's Bottom of the Pyramid, raises the question of reverse transfer of knowledge: whether knowledge could and should be transferred from local sites to home-country sites within an organization. As several emerging economies build their capabilities in knowledge, research and development, marketing, and the like, it only makes sense to consider what types of knowledge to transfer, and how to transfer them, in reverse or bi-directional manners. This paper takes one step back in the process. Rather than focusing on what knowledge transfer may make sense within an organization, we consider what types of knowledge are important for foreigners to know at the initial stages of engagement abroad as they consider whether to do business in an emerging country.

Relevance: 20.00%

Publisher:

Abstract:

info:eu-repo/semantics/published

Relevance: 20.00%

Publisher:

Abstract:

We present an analytical method that yields the real and imaginary parts of the refractive index (RI) from low-coherence interferometry measurements, leading to the separation of the scattering and absorption coefficients of turbid samples. The imaginary RI is measured using time-frequency analysis, with the real part obtained by analyzing the nonlinear phase induced by a sample. A derivation relating the real part of the RI to the nonlinear phase term of the signal is presented, along with measurements from scattering and nonscattering samples that exhibit absorption due to hemoglobin.
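As a hedged sketch of the standard optics relations involved (sample thickness L, wavenumber k; this is background convention, not the paper's exact derivation):

```latex
% Complex refractive index and its two measured parts:
\[
  \tilde{n}(k) = n(k) + i\,\kappa(k), \qquad
  \kappa(k) = -\frac{1}{2kL}\,\ln\!\frac{A(k)}{A_0},
\]
% where A(k)/A_0 is the measured attenuation (Beer--Lambert; the factor
% of 2 assumes round-trip propagation through the sample).
% The measured spectral phase splits into a linear bulk term and a
% nonlinear remainder, from which the real part follows:
\[
  \phi(k) = n(k)\,kL = n_0\,kL + \phi_{\mathrm{NL}}(k)
  \;\Longrightarrow\;
  n(k) = n_0 + \frac{\phi_{\mathrm{NL}}(k)}{kL}.
\]
```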

Relevance: 20.00%

Publisher:

Abstract:

In some supply chains, materials are ordered periodically according to local information. This paper investigates how to improve the performance of such a supply chain. Specifically, we consider a serial inventory system in which each stage implements a local reorder interval policy; i.e., each stage orders up to a local base-stock level according to a fixed-interval schedule. A fixed cost is incurred for placing an order. Two improvement strategies are considered: (1) expanding the information flow by acquiring real-time demand information and (2) accelerating the material flow via flexible deliveries. The first strategy leads to a reorder interval policy with full information; the second strategy leads to a reorder point policy with local information. Both policies have been studied in the literature. Thus, to assess the benefit of these strategies, we analyze the local reorder interval policy. We develop a bottom-up recursion to evaluate the system cost and provide a method to obtain the optimal policy. A numerical study shows the following: increasing the flexibility of deliveries lowers costs more than does expanding information flow; the fixed order costs and the system lead times are key drivers that determine the effectiveness of these improvement strategies. In addition, we find that inferring reorder intervals from the optimal batch sizes of the reorder point policy and the demand rate may lead to significant cost inefficiency. © 2010 INFORMS.
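A hedged, single-stage sketch of the local reorder interval policy described above (the cost numbers, names, and zero-lead-time simplification are illustrative assumptions; the paper's bottom-up recursion for the full serial system is not reproduced here):

```python
import numpy as np

def simulate_reorder_interval(T, base_stock, fixed_cost, hold_cost,
                              backlog_cost, demand_mean, periods, seed=0):
    """Single stage ordering up to `base_stock` every T periods, paying a
    fixed cost per order placed; zero lead time assumed for brevity."""
    rng = np.random.default_rng(seed)
    inv = base_stock            # net inventory (negative = backlog)
    cost = 0.0
    for t in range(periods):
        if t % T == 0 and inv < base_stock:
            cost += fixed_cost  # fixed cost for placing an order
            inv = base_stock    # order up to the local base-stock level
        inv -= rng.poisson(demand_mean)
        cost += hold_cost * max(inv, 0) + backlog_cost * max(-inv, 0)
    return cost / periods

# Crude policy search: sweep T and the base-stock level, keep the cheapest.
best = min(((T, s, simulate_reorder_interval(T, s, 100, 1, 9, 5, 10_000))
            for T in range(1, 9) for s in range(5, 80, 5)),
           key=lambda x: x[2])
print(best)
```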

Relevance: 20.00%

Publisher:

Abstract:

Considerable scientific and intervention attention has been paid to judgment and decision-making systems associated with aggressive behavior in youth. However, most empirical studies have investigated social-cognitive correlates of stable child and adolescent aggressiveness, and less is known about real-time decision making to engage in aggressive behavior. A model of real-time decision making must incorporate both impulsive actions and rational thought. The present paper advances a process model (response evaluation and decision; RED) of real-time behavioral judgments and decision making in aggressive youths, with mathematical representations that may be used to quantify response strength. These components are a heuristic for describing decision making, though it is doubtful that individuals always mentally complete these steps. RED represents an organization of social-cognitive operations believed to be active during the response decision step of social information processing. The model posits that RED processes can be circumvented through impulsive responding. This article provides a description and integration of thoughtful, rational decision making and nonrational impulsivity in aggressive behavioral interactions.
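The abstract does not reproduce RED's mathematical representations. As one loudly hypothetical illustration of how a response-strength quantity might be written, an expectancy-value form (an assumption for exposition, not necessarily RED's published formulation):

```latex
% Hypothetical expectancy-value sketch, not RED's published equations:
\[
  RS(r) = \sum_{j} p_j(r)\, v_j(r),
\]
% where p_j(r) is the perceived probability that enacting response r
% produces outcome j, and v_j(r) is that outcome's subjective value; the
% highest-RS response would be selected unless impulsive responding
% circumvents evaluation entirely.
```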

Relevance: 20.00%

Publisher:

Abstract:

Factors influencing apoptosis of vertebrate eggs and early embryos have been studied in cell-free systems and in intact embryos by analyzing individual apoptotic regulators or caspase activation in static samples. A novel method for monitoring caspase activity in living Xenopus oocytes and early embryos is described here. The approach, using microinjection of a near-infrared caspase substrate that emits fluorescence only after its proteolytic cleavage by active effector caspases, has enabled the elucidation of otherwise cryptic aspects of apoptotic regulation. In particular, we show that brief caspase activity (10 min) is sufficient to cause apoptotic death in this system. We illustrate a cytochrome c dose threshold in the oocyte, which is lowered by Smac, a protein that binds and thereby neutralizes the inhibitor of apoptosis proteins. We show that meiotic oocytes develop resistance to cytochrome c, and that the eventual death of oocytes arrested in meiosis is caspase-independent. Finally, data acquired through imaging caspase activity in the Xenopus embryo suggest that apoptosis in very early development is not cell-autonomous. These studies both validate this assay as a useful tool for apoptosis research and reveal subtleties in the cell death program during early development. Moreover, this method offers a potentially valuable screening modality for identifying novel apoptotic regulators.

Relevance: 20.00%

Publisher:

Abstract:

Gemstone Team Future Firefighting Advancements

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Anterior cruciate ligament (ACL) reconstruction is associated with a high incidence of second tears (graft tears and contralateral ACL tears). These secondary tears have been attributed to asymmetrical lower extremity mechanics. Knee bracing is one potential intervention that can be used during rehabilitation and has the potential to normalize lower extremity asymmetry; however, little is known about the effect of bracing on movement asymmetry in patients following ACL reconstruction. HYPOTHESIS: Wearing a knee brace would increase knee joint flexion, and joint mechanics were expected to become more symmetrical in the braced condition. OBJECTIVE: To examine how knee bracing affects knee joint function and symmetry over the course of rehabilitation in patients 6 months following ACL reconstruction. STUDY DESIGN: Controlled laboratory study. LEVEL OF EVIDENCE: Level 3. METHODS: Twenty-three adolescent patients rehabilitating from ACL reconstruction surgery were recruited for the study. Six months following surgery, all subjects underwent a motion analysis assessment during a stop-jump activity with and without a functional knee brace, worn on the surgical side, that resisted extension. Statistical analysis utilized a 2 × 2 (limb × brace) analysis of variance with a significance (alpha) level of 0.05. RESULTS: Subjects had increased knee flexion on the surgical side when they were braced. The brace condition increased knee flexion velocity, decreased the initial knee flexion angle, and increased the ground reaction force and knee extension moment on both limbs. Side-to-side asymmetry was present across conditions for the vertical ground reaction force and knee extension moment. CONCLUSION: Wearing a knee brace appears to increase lower extremity compliance and promotes normalized loading on the surgical side. CLINICAL RELEVANCE: Knee extension constraint bracing in postoperative ACL patients may improve symmetry of lower extremity mechanics, which is potentially beneficial in progressing rehabilitation and reducing the incidence of second ACL tears.
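A hedged sketch of the reported 2 × 2 (limb × brace) within-subject ANOVA, using synthetic stand-in data (the effect sizes and the statsmodels-based analysis are illustrative assumptions, not the study's data or code):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic stand-in for 23 subjects x 2 limbs x 2 brace conditions.
rng = np.random.default_rng(0)
rows = []
for subject in range(23):
    for limb in ("surgical", "nonsurgical"):
        for brace in ("braced", "unbraced"):
            flexion = (45.0
                       + (5.0 if brace == "braced" else 0.0)   # brace effect
                       - (4.0 if limb == "surgical" else 0.0)  # limb asymmetry
                       + rng.normal(0.0, 3.0))
            rows.append((subject, limb, brace, flexion))
df = pd.DataFrame(rows, columns=["subject", "limb", "brace", "flexion"])

# 2 x 2 repeated-measures ANOVA, alpha = 0.05.
print(AnovaRM(df, depvar="flexion", subject="subject",
              within=["limb", "brace"]).fit())
```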

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. RESULTS: Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), rhinovirus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, as well as with sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. CONCLUSIONS: Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we considered a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.
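A hedged generative sketch of the sparsity mechanism named above: the finite beta-Bernoulli approximation to the Beta/Indian Buffet Process, under which the number of active factors is inferred rather than fixed (variable names are hypothetical, and the Gibbs/VB inference used in the paper is not shown):

```python
import numpy as np

def sample_sparse_factor_model(n_genes, n_samples, K_max,
                               a=1.0, b=1.0, noise_sd=0.1, seed=0):
    """Draw from X = (Z * W) S + E with a finite beta-Bernoulli prior on
    the binary loading mask Z; as K_max grows this approximates the Beta
    Process, and typically only some of the K_max factors end up active."""
    rng = np.random.default_rng(seed)
    pi = rng.beta(a / K_max, b * (K_max - 1) / K_max, size=K_max)
    Z = rng.random((n_genes, K_max)) < pi       # sparse binary loading mask
    W = rng.normal(size=(n_genes, K_max))       # loading weights
    S = rng.normal(size=(K_max, n_samples))     # factor scores
    X = (Z * W) @ S + rng.normal(0.0, noise_sd, (n_genes, n_samples))
    n_active = int((Z.sum(axis=0) > 0).sum())   # factors actually used
    return X, n_active
```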

Relevance: 20.00%

Publisher:

Abstract:

An enterprise information system (EIS) is an integrated data-applications platform characterized by diverse, heterogeneous, and distributed data sources. For many enterprises, a number of business processes still depend heavily on static rule-based methods and extensive human expertise. Enterprises are faced with the need for optimizing operation scheduling, improving resource utilization, discovering useful knowledge, and making data-driven decisions.

This thesis research is focused on real-time optimization and knowledge discovery that addresses workflow optimization, resource allocation, as well as data-driven predictions of process-execution times, order fulfillment, and enterprise service-level performance. In contrast to prior work on data analytics techniques for enterprise performance optimization, the emphasis here is on realizing scalable and real-time enterprise intelligence based on a combination of heterogeneous system simulation, combinatorial optimization, machine-learning algorithms, and statistical methods.

On-demand digital-print service is a representative enterprise requiring a powerful EIS. We use real-life data from Reischling Press, Inc. (RPI), a digital print service provider (PSP), to evaluate our optimization algorithms.

In order to handle the increase in volume and diversity of demands, we first present a high-performance, scalable, and real-time production scheduling algorithm for production automation based on an incremental genetic algorithm (IGA). The objective of this algorithm is to optimize the order dispatching sequence and balance resource utilization. Compared to prior work, this solution is scalable for a high volume of orders and it provides fast scheduling solutions for orders that require complex fulfillment procedures. Experimental results highlight its potential benefit in reducing production inefficiencies and enhancing the productivity of an enterprise.
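A hedged toy sketch of the incremental idea behind an IGA scheduler: when new orders arrive, the surviving population is extended and re-evolved instead of restarting from scratch (the total-tardiness objective and all names are illustrative assumptions, not RPI's actual workflow or objective):

```python
import random

def total_tardiness(seq, proc_time, due):
    """Toy dispatching objective: summed tardiness along the sequence."""
    t = cost = 0
    for order in seq:
        t += proc_time[order]
        cost += max(0, t - due[order])
    return cost

def incremental_ga(orders, proc_time, due, pop=None, gens=200, seed=0):
    """Evolve dispatch sequences; pass the returned population back in
    after new orders arrive to continue incrementally."""
    rng = random.Random(seed)
    if pop is None:
        pop = [rng.sample(list(orders), len(orders)) for _ in range(30)]
    else:  # splice newly arrived orders into the surviving individuals
        pop = [ind + [o for o in orders if o not in ind] for ind in pop]
    for _ in range(gens):
        pop.sort(key=lambda s: total_tardiness(s, proc_time, due))
        survivors, children = pop[:10], []
        for _ in range(20):
            child = list(rng.choice(survivors))
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        pop = survivors + children
    pop.sort(key=lambda s: total_tardiness(s, proc_time, due))
    return pop[0], pop  # best sequence, plus the population for reuse
```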

We next discuss analysis and prediction of different attributes involved in hierarchical components of an enterprise. We start from a study of the fundamental processes related to real-time prediction. Our process-execution time and process status prediction models integrate statistical methods with machine-learning algorithms. In addition to improved prediction accuracy compared to stand-alone machine-learning algorithms, this approach also performs a probabilistic estimation of the predicted status. An order generally consists of multiple series and parallel processes. We next introduce an order-fulfillment prediction model that combines the advantages of multiple classification models by incorporating flexible decision-integration mechanisms. Experimental results show that adopting due dates recommended by the model can significantly reduce the enterprise late-delivery ratio. Finally, we investigate service-level attributes that reflect the overall performance of an enterprise. We analyze and decompose time-series data into different components according to their hierarchical periodic nature, perform correlation analysis, and develop univariate prediction models for each component as well as multivariate models for correlated components. Predictions for the original time series are aggregated from the predictions of its components. In addition to a significant increase in mid-term prediction accuracy, this distributed modeling strategy also improves short-term time-series prediction accuracy.
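A hedged sketch of the distributed decomposition strategy (the period lengths, the mean-profile decomposition, and the trivial residual model are assumptions for illustration; the thesis's correlation analysis and multivariate models are not shown):

```python
import numpy as np

def decompose_and_forecast(y, periods=(24, 168), horizon=24):
    """Strip out one mean profile per period (e.g. daily, weekly), model
    the residual trivially by its mean, forecast each component, and
    aggregate the component forecasts; assumes len(y) covers each period."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))
    residual, profiles = y.copy(), {}
    for p in periods:
        prof = np.array([residual[t % p == i].mean() for i in range(p)])
        profiles[p] = prof
        residual = residual - prof[t % p]
    future = np.arange(len(y), len(y) + horizon)
    forecast = np.full(horizon, residual.mean())
    for p, prof in profiles.items():
        forecast += prof[future % p]   # aggregate per-component forecasts
    return forecast
```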

In summary, this thesis research has led to a set of characterization, optimization, and prediction tools for an EIS to derive insightful knowledge from data and use it as guidance for production management. It is expected to provide solutions for enterprises to increase reconfigurability, accomplish more automated procedures, and obtain data-driven recommendations for effective decisions.

Relevance: 20.00%

Publisher:

Abstract:

The Duke University Medical Center Library and Archives is located in the heart of the Duke Medicine campus, surrounded by Duke Hospital, ambulatory clinics, and numerous research facilities. Its location is considered prime real estate, given its adjacency to patient care, research, and educational activities. In 2005, the Duke University Library Space Planning Committee had recommended creating a learning center in the library that would support a variety of educational activities. However, the health system needed to convert the library's top floor into office space to make way for expansion of the hospital and cancer center. The library had only five months to plan the storage and consolidation of its journal and book collections, while working with the facilities design office and architect on the replacement of key user spaces on the top floor. Library staff worked together to develop plans for storing, weeding, and consolidating the collections and provided input into renovation plans for user spaces on its mezzanine level. The library lost 15,238 square feet (29%) of its net assignable square footage and a total of 16,897 gross square feet (30%). This included 50% of the total space allotted to collections and over 15% of user spaces. The top-floor space now houses offices for Duke Medicine oncology faculty and staff. By storing a large portion of its collection off-site, the library was able to remove more stacks on the remaining stack level and convert them to user spaces, a long-term goal for the library. Additional space on the mezzanine level had to be converted to replace lost study and conference room spaces. While this project did not match the recommended space plans for the library, it underscored the need for the library to think creatively about the future of its facility and to work toward a more cohesive master plan.

Relevance: 20.00%

Publisher:

Abstract:

To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally relevant event sequences.
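A hedged sketch of the data-driven decomposition step (FastICA stands in for whichever ICA variant the study used; the assumption that ratings align one-per-time-window with the component time courses is a hypothetical simplification):

```python
import numpy as np
from sklearn.decomposition import FastICA

def emotion_correlated_components(bold, intensity, n_components=20):
    """Decompose free-viewing fMRI data (time x voxels) into independent
    components, then correlate each component's time course with
    post-scan intensity ratings (assumed aligned, one per time window)."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(bold)        # (time, n_components) courses
    spatial_maps = ica.components_           # (n_components, voxels) maps
    r = np.array([np.corrcoef(sources[:, k], intensity)[0, 1]
                  for k in range(n_components)])
    return r, spatial_maps
```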