8 results for Simulation with multiple Consumers Profiles
in Digital Commons at Florida International University
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and different configurations of parallel processing with multiple product classes, and job circulation due to random part failures. In addition, appropriate correction terms obtained via regression analysis were added to the approximations in order to minimize the error between the analytical approximation and the simulation models. Markovian and general-type manufacturing systems with multiple product classes, job circulation due to failures, and fork-join systems to model parallel processing were studied. In both the Markovian and the general case, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. In general, the average flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case. All the equations stated in the analytical formulations were implemented as a set of Matlab scripts. By using this set, operations managers of web server assembly lines, or of manufacturing or other service systems with similar characteristics, can estimate different system performance measures and make judicious decisions, especially when setting delivery due dates, planning capacity, and mitigating bottlenecks.
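The dissertation's formulations were implemented as Matlab scripts; as a rough illustration only (in Python, not the author's code), the sketch below shows the general shape of such an approximation: a single-station M/M/1 flow time estimate adjusted by a hypothetical regression-based correction that grows with the number of product classes and the traffic intensity. The coefficients b0, b1, b2 are placeholders, not fitted values from the study.

```python
# Minimal illustrative sketch (not the dissertation's Matlab code): an M/M/1-style
# flow time approximation for a single station, plus a hypothetical regression-based
# correction term of the kind the abstract describes.

def mm1_flow_time(arrival_rate: float, service_rate: float) -> float:
    """Expected time in system (queueing + service) for an M/M/1 station."""
    if arrival_rate >= service_rate:
        raise ValueError("Traffic intensity must be below 1 for a stable station.")
    return 1.0 / (service_rate - arrival_rate)

def corrected_flow_time(arrival_rate: float, service_rate: float,
                        n_products: int, b0=0.0, b1=0.0, b2=0.0) -> float:
    """Apply a regression-style correction that grows with the number of product
    classes and the traffic intensity, as the abstract suggests. The linear form
    and the coefficients are placeholder assumptions."""
    rho = arrival_rate / service_rate
    base = mm1_flow_time(arrival_rate, service_rate)
    correction = b0 + b1 * n_products + b2 * rho  # hypothetical correction model
    return base * (1.0 + correction)

if __name__ == "__main__":
    # Example: one station, traffic intensity 0.8, three product classes.
    print(corrected_flow_time(arrival_rate=0.8, service_rate=1.0,
                              n_products=3, b0=0.02, b1=0.01, b2=0.05))
```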
Abstract:
This study evaluated inter- and intra-individual changes in acculturation, acculturative stress, and adaptation experiences, as well as their associations with adjustment outcomes among a group of Latino adolescents in South Florida. Specifically, the current study investigated the incidence, changes, and effects of stressors that arise from acculturation experiences (e.g., related to culture, discrimination, language difficulties) among Latino youth by employing a person-centered approach and a longitudinal research design. Four separate groups of analyses were conducted to investigate (a) within-group differences in levels of reported acculturative stress, (b) patterns of continuity and discontinuity in levels of acculturative stress across time, (c) adjustment outcomes associated with distinct patterns of acculturative stress within each measurement occasion, and (d) predictive relations between longitudinal acculturative stress trajectories in early adolescence and psychosocial adjustment outcomes in young adulthood. Results from the multivariate analyses indicated great within-group heterogeneity in acculturative stress among Latino youth during early adolescence, as well as significant continuity and discontinuity in the patterns of shifts among acculturative stress profiles between contiguous measurement occasions. Within each developmental period, membership in acculturative stress clusters was significantly and differentially associated with multiple adjustment outcomes, suggesting that maladaptive outcomes are more likely to occur among Latino adolescents experiencing high levels of psychological distress across multiple acculturative domains. In general, Latino youth acculturation is best understood as multi-dimensional, variable across time, and fluid and responsive to multiple factors and influences. Implications for preventive strategies are discussed with regard to the acculturation and developmental psychology research literatures.
Abstract:
Developing analytical models that can accurately describe behaviors of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size, and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems. First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments. However, this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with large-scale network models. Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplications in network models to dramatically reduce the memory required to execute large-scale network experiments. Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation, but, as real systems are included in the network model, it also increases the confidence level of network simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
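As a rough illustration of the memory-reduction idea described above (not the dissertation's simulator), the sketch below shares one immutable structural description among many structurally identical routers, in the spirit of the flyweight pattern. All class and field names are illustrative assumptions.

```python
# Minimal sketch: exploit structural duplication in a network model by letting
# structurally identical routers reference a single read-only template object
# instead of each carrying its own copy of the bulky structural description.

from dataclasses import dataclass

@dataclass(frozen=True)
class RouterTemplate:
    """Read-only structure shared by every router instantiated from it."""
    n_interfaces: int
    bandwidth_bps: int
    queue_limit_pkts: int

@dataclass
class RouterInstance:
    """Per-router state stays small; the bulky structure is shared."""
    node_id: int
    template: RouterTemplate   # shared reference, not a copy
    queue_occupancy: int = 0   # mutable, instance-specific state

def build_campus(n_routers, template):
    # All routers reference the same template object, so memory grows with the
    # per-node state only, not with the duplicated structural description.
    return [RouterInstance(node_id=i, template=template) for i in range(n_routers)]

if __name__ == "__main__":
    tpl = RouterTemplate(n_interfaces=4, bandwidth_bps=10**9, queue_limit_pkts=1000)
    campus = build_campus(10_000, tpl)
    assert all(r.template is tpl for r in campus)  # one shared structure
```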
Abstract:
Chronic bronchopulmonary bacterial infections remain the most common cause of morbidity and mortality among patients with cystic fibrosis (CF). Recent community sequencing work has shown that the bacterial community in the CF lung is polymicrobial. Identifying bacteria in the CF lung through sequencing can be costly and is not practical for many laboratories. Molecular techniques such as terminal restriction fragment length polymorphism or amplicon length heterogeneity-polymerase chain reaction (LH-PCR) can give many laboratories the ability to study CF bacterial communities without costly sequencing. The aim of this study was to determine whether LH-PCR with multiple hypervariable regions of the 16S rRNA gene could be used to identify organisms found in sputum DNA. This work also determined whether LH-PCR could be used to observe the dynamics of lung infections over time. Nineteen samples were analysed with the V1 and V1_V2 regions of the 16S rRNA gene. Based on the amplicon size present in the V1_V2 region, Pseudomonas aeruginosa was confirmed to be present in all 19 samples obtained from the patients. The V1 region provided a higher power of discrimination between the bacterial profiles of patients. Both regions were able to identify trends in the bacterial population over time. LH profiles showed that the CF lung community is dynamic and that changes in the community may in part be driven by the patient's antibiotic treatment. LH-PCR is a tool that is well suited for studying bacterial communities and their dynamics.
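As an illustration only (not the study's analysis pipeline), the sketch below shows how observed amplicon lengths could be matched against a reference table within a small tolerance to call candidate taxa. The reference lengths are hypothetical placeholders, not values reported in the study.

```python
# Illustrative LH-PCR-style taxon calling: assign observed amplicon lengths to
# candidate taxa by matching against a reference table within a tolerance.
# The reference lengths below are hypothetical placeholders.

HYPOTHETICAL_V1_V2_LENGTHS = {
    "Pseudomonas aeruginosa": 340,
    "Staphylococcus aureus": 355,
    "Haemophilus influenzae": 362,
}

def assign_taxa(observed_lengths, reference, tolerance_bp=2):
    """Return {observed_length: [candidate taxa]} for each observed amplicon."""
    calls = {}
    for length in observed_lengths:
        candidates = [taxon for taxon, ref in reference.items()
                      if abs(length - ref) <= tolerance_bp]
        calls[length] = candidates
    return calls

if __name__ == "__main__":
    profile = [339, 355, 401]  # example fragment lengths from one sputum sample
    print(assign_taxa(profile, HYPOTHETICAL_V1_V2_LENGTHS))
```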
Abstract:
With the progress of computer technology, computers are expected to be more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may encounter difficulties in perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberration of the user's eye. A complete and systematic modeling approach to describe the retinal image formation of the computer user was presented, taking advantage of modeling tools such as Zernike polynomials, wavefront aberration, the Point Spread Function, and the Modulation Transfer Function. The ocular aberration of the computer user was first measured by a wavefront aberrometer, serving as a reference for the precompensation model. The dynamic precompensation was generated based on the resized aberration, with the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation, using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of the image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use. The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method, showing a significant improvement in recognition accuracy. The merit and necessity of dynamic precompensation were also substantiated by comparing it with static precompensation. The visual benefit of dynamic precompensation was further confirmed by the subjective assessments collected from the evaluation participants.
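As a simplified illustration of the pipeline described above (not the dissertation's implementation), the sketch below builds a pupil function from a single Zernike defocus term, derives the corresponding PSF, and precompensates an image with a Wiener-style inverse filter. The pupil size, defocus coefficient, and noise-to-signal ratio are illustrative assumptions.

```python
# Simplified precompensation sketch: pupil function with Zernike defocus,
# PSF via Fourier optics, and Wiener-style inverse filtering of the target image.

import numpy as np

def defocus_psf(grid=256, pupil_radius=0.45, defocus_waves=1.0):
    """PSF of a circular pupil with Zernike defocus Z(2,0) = sqrt(3)*(2r^2 - 1)."""
    y, x = np.mgrid[-1:1:grid * 1j, -1:1:grid * 1j]
    r = np.hypot(x, y) / pupil_radius          # normalized pupil coordinate
    pupil = (r <= 1.0).astype(float)
    wavefront = defocus_waves * np.sqrt(3) * (2 * r**2 - 1)   # in waves
    field = pupil * np.exp(2j * np.pi * wavefront)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()

def precompensate(image, psf, noise_to_signal=1e-2):
    """Wiener-style inverse filter: boosts frequencies the eye's blur attenuates."""
    otf = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    wiener = np.conj(otf) / (np.abs(otf)**2 + noise_to_signal)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * wiener))

if __name__ == "__main__":
    target = np.zeros((256, 256)); target[96:160, 96:160] = 1.0  # a test icon
    pre = precompensate(target, defocus_psf())
    # Displaying `pre` and blurring it with the same PSF should yield an image
    # closer to `target` than blurring `target` directly would.
```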
Abstract:
Community ecology seeks to understand and predict the characteristics of communities that can develop under different environmental conditions, but most theory has been built on analytical models that are limited in the diversity of species traits that can be considered simultaneously. We address that limitation with an individual-based model to simulate assembly of fish communities characterized by life history and trophic interactions with multiple physiological tradeoffs as constraints on species performance. Simulation experiments were carried out to evaluate the distribution of 6 life history and 4 feeding traits along gradients of resource productivity and prey accessibility. These experiments revealed that traits differ greatly in importance for species sorting along the gradients. Body growth rate emerged as a key factor distinguishing community types and defining patterns of community stability and coexistence, followed by egg size and maximum body size. Dominance by fast-growing, relatively large, and fecund species occurred more frequently in cases where functional responses were saturated (i.e. high productivity and/or prey accessibility). Such dominance was associated with large biomass fluctuations and priority effects, which prevented richness from increasing with productivity and may have limited selection on secondary traits, such as spawning strategies and relative size at maturation. Our results illustrate that the distribution of species traits and the consequences for community dynamics are intimately linked and strictly dependent on how the benefits and costs of these traits are balanced across different conditions.
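As a structural illustration only (not the authors' model), the skeleton below shows the general shape of such a simulation experiment: individuals carry a trait vector, communities are assembled under different resource productivities, and the surviving trait distribution is inspected. The trait ranges and birth/death rules are placeholder assumptions and imply no ecological conclusions.

```python
# Skeleton of a trait-based, individual-based assembly experiment along a
# productivity gradient. All parameter values and rules are illustrative.

import random
from dataclasses import dataclass

@dataclass
class Fish:
    growth_rate: float  # body growth rate
    max_size: float     # maximum body size
    egg_size: float     # offspring investment per egg

def random_species(rng):
    return Fish(growth_rate=rng.uniform(0.1, 1.0),
                max_size=rng.uniform(1.0, 10.0),
                egg_size=rng.uniform(0.01, 0.1))

def assemble(productivity, n_steps=500, seed=0):
    rng = random.Random(seed)
    community = [random_species(rng) for _ in range(100)]
    for _ in range(n_steps):
        share = productivity / len(community)  # equal split of the resource
        survivors = []
        for fish in community:
            # Survival and reproduction depend on traits and on the resource
            # share; these tradeoff terms are placeholders, not fitted rules.
            p_survive = min(1.0, share * fish.growth_rate / fish.max_size)
            if rng.random() < p_survive:
                survivors.append(fish)
                if rng.random() < fish.egg_size * 5:  # crude fecundity rule
                    survivors.append(fish)            # clonal recruit
        community = survivors or [random_species(rng)]  # avoid total extinction
    return community

if __name__ == "__main__":
    for p in (50, 500):
        traits = assemble(productivity=p)
        mean_growth = sum(f.growth_rate for f in traits) / len(traits)
        print(p, len(traits), round(mean_growth, 2))
```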