917 results for Modeling and control


Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study is to identify and analyze the basic causes of food service employee turnover in five selected restaurants in the Miami area. The withdrawal behavior in this study is treated in terms of controllable turnover, for the purpose of management learning more about what action to take to solve this problem, which has eaten into the fabric of the hospitality industry. The aim is to obtain the views of food service employees and management in order to identify the variables which cause an employee to voluntarily leave a job. The objective, therefore, is to analyze and describe the problem of labor turnover in these selected restaurants. Such description must precede efforts to arrive at solutions to the problem if these efforts are ever to be more than haphazard and superficial. Sigmund Freud once stated: "The true beginning of scientific activity consists in describing phenomena and only then in proceeding to group, classify and correlate them."1 The nature of the study is basically a descriptive survey. Data were collected by the use of a management questionnaire, a food service employee questionnaire and, finally, an employee job description index. The survey consisted of a series of well-defined open-ended and closed-ended questions dealing with employee turnover. As Robert Ferber and P. J. Verdoorn state in their book Research Methods in Economics and Business: "Structured questionnaires, by supplying question formulations in very specific terms as well as the different possible answers, are easier for the sample members to answer and also serve to reduce the danger of interviewer bias."2 The answers to the prepared questionnaire by sample members were then recorded. The results of the questionnaire responses were then compiled for presentation and analysis. 1 Julian Simon, Basic Research Methods in Social Science, Random House, New York, 1969, p. 53. 2 Robert Ferber and P. J. Verdoorn, Research Methods in Economics and Business, The Macmillan Company, 1962, p. 209.

Relevance:

100.00%

Publisher:

Abstract:

Present theories of deep-sea community organization recognize the importance of small-scale biological disturbances, originating partly from the activities of epibenthic megafaunal organisms, in maintaining high benthic biodiversity in the deep sea. However, due to technical difficulties, in situ experimental studies to test hypotheses in the deep sea are lacking. The objective of the present study was to evaluate the potential of cages as tools for studying the importance of epibenthic megafauna for deep-sea benthic communities. Using the deep-diving Remotely Operated Vehicle (ROV) "VICTOR 6000", six experimental cages were deployed at the sea floor at 2500 m water depth and sampled after 2 years (2y) and 4 years (4y) for a variety of sediment parameters in order to test for caging artefacts. Photo and video footage from both experiments showed that the cages were efficient at excluding the targeted fauna. The cages also proved appropriate for deep-sea studies, given that there was no fouling on the cages and no evidence of any organism establishing residence on or adjacent to them. Environmental changes inside the cages were dependent on the experimental period analysed. In the 4y experiment, chlorophyll a concentrations were higher in the uppermost centimeter of sediment inside cages, whereas in the 2y experiment they did not differ between inside and outside. Although the cages caused some changes to the sedimentary regime, these changes were relatively minor compared with those reported in similar studies in shallow water. The only parameter that was significantly higher under cages in both experiments was the concentration of phaeopigments. Since the epibenthic megafauna at our study site can potentially affect phytodetritus distribution and availability at the seafloor (e.g. via consumption, disaggregation and burial), we suggest that their exclusion was, at least in part, responsible for the increases in pigment concentrations. Cages might be suitable tools to study the long-term effects of disturbances caused by megafaunal organisms on the diversity and community structure of smaller-sized organisms in the deep sea, although further work employing partial cage controls, greater replication, and evaluating faunal components will be essential to unequivocally establish their utility.

Relevance:

100.00%

Publisher:

Abstract:

13/01/15. Funded by the Faculty of Management at Radboud University Nijmegen.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a theoretical model for the vibration analysis of micro-scale fluid-loaded rectangular isotropic plates, based on Lamb's assumption of fluid-structure interaction and the Rayleigh-Ritz energy method. An analytical solution for this model is proposed, which can be applied to most cases of boundary conditions. Dynamic experimental data for a series of microfabricated silicon plates are obtained using a base-excitation dynamic testing facility. The natural frequencies and mode shapes in the experimental results are in good agreement with the theoretical simulations for the lower-order modes. The presented theoretical and experimental investigations of the vibration characteristics of the micro-scale plates are of particular interest in the design of microplate-based biosensing devices. Copyright © 2009 by ASME.
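
As background on the fluid-loading effect invoked above (this relation is paraphrased from the general Lamb added-mass literature, not quoted from the paper), such corrections are often written in terms of a nondimensional added virtual mass incremental (AVMI) factor β that lowers each in-vacuum natural frequency:

\[
f_{\mathrm{fluid}} = \frac{f_{\mathrm{vac}}}{\sqrt{1+\beta}}, \qquad
\beta = \Gamma \, \frac{\rho_{\mathrm{f}}\, a}{\rho_{\mathrm{p}}\, h},
\]

where ρ_f and ρ_p are the fluid and plate densities, a is a characteristic in-plane dimension, h is the plate thickness, and Γ is a nondimensional factor that depends on the mode shape and boundary conditions. The specific formulation and Rayleigh-Ritz trial functions used in the paper may differ.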

Relevance:

100.00%

Publisher:

Abstract:

A class of multi-process models is developed for collections of time-indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
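
To make the augmentation idea concrete, the sketch below applies Pólya-Gamma data augmentation (Polson et al., 2013) to a single binomial observation with a Gaussian prior on its logit-scale natural parameter. It is only a minimal illustration of the mechanism, not the dissertation's multi-process or state-space model; the observation, the prior values and the truncated sum-of-Gammas Pólya-Gamma draw are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def sample_pg(b, c, n_terms=200):
    # Approximate draw from PG(b, c) via a truncated version of its
    # infinite sum-of-Gammas representation (Polson et al., 2013).
    k = np.arange(1, n_terms + 1)
    g = rng.gamma(shape=b, scale=1.0, size=n_terms)
    return np.sum(g / ((k - 0.5) ** 2 + (c / (2.0 * np.pi)) ** 2)) / (2.0 * np.pi ** 2)

def gibbs_binomial_logit(y, n, mu0=0.0, sigma0=2.0, n_iter=2000):
    # Gibbs sampler for y ~ Binomial(n, logistic(psi)) with prior psi ~ N(mu0, sigma0^2).
    psi = 0.0
    kappa = y - n / 2.0
    draws = np.empty(n_iter)
    for t in range(n_iter):
        omega = sample_pg(n, psi)              # omega | psi, y  ~  PG(n, psi)
        v = 1.0 / (omega + 1.0 / sigma0 ** 2)  # conditional variance of psi
        m = v * (kappa + mu0 / sigma0 ** 2)    # conditional mean of psi
        psi = rng.normal(m, np.sqrt(v))        # psi | omega, y is Gaussian
        draws[t] = psi
    return draws

draws = gibbs_binomial_logit(y=34, n=50)
print("posterior mean success probability:", (1.0 / (1.0 + np.exp(-draws[500:]))).mean())

Conditional on the Pólya-Gamma draw, the logit parameter has a Gaussian full conditional; it is this conjugacy that makes Gibbs sampling tractable when the scheme is extended to dynamic binomial, multinomial and Poisson models.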

Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.

The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.

The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age-specific latent natural ability class and a performance-enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of a player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.

All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis was to investigate the high prevalence of Clostridium difficile in patients with cystic fibrosis (CF), and to control its dissemination. To determine the carriage rate of C. difficile in CF patients, 60 patients were tested for C. difficile and its toxin. In total, 50% of patients were found to be asymptomatic carriers of C. difficile, despite toxin being detected in 31.66% of patients. Ribotyping of the C. difficile isolates revealed 16 distinct ribotypes, including the hypervirulent RT078. All isolates were sensitive to both vancomycin and metronidazole. The effect of CF and its treatment on the gut microbiota of CF patients was assessed by 16S sequencing of the gut microbiota of 68 CF patients. When compared to a healthy control group, the CF patient gut microbiota was found to be less diverse and had an increased Firmicutes to Bacteroidetes ratio. Interestingly, CF patients who were carriers of C. difficile had a less diverse gut microbiota than C. difficile-negative CF patients. Multilocus sequence typing was found to be comparable to PCR ribotyping for typing C. difficile isolates from high-risk patient groups. The sequence type ST26 is potentially associated with CF patients, as all seven isolates of this type were found in this group and this sequence type has been previously reported in CF patients in a geographically distinct study. The bacteriophage φCD6356 was assessed as a targeted antimicrobial against C. difficile in an ex vivo model of the human distal colon. Despite reducing viable C. difficile by 1.75 logs over 24 hours, this bacteriophage was not suitable due to its lysogenic nature: following treatment, all surviving C. difficile were immune to reinfection due to prophage integration. However, the φCD6356-encoded endolysin was capable of reducing viable C. difficile by 2.9 logs over 2 hours in vitro after being cloned and expressed in Escherichia coli.

Relevance:

100.00%

Publisher:

Abstract:

While molecular and cellular processes are often modeled as stochastic processes, such as Brownian motion, chemical reaction networks and gene regulatory networks, there are few attempts to program a molecular-scale process to physically implement stochastic processes. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications remains unknown.

In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
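
As a rough, self-contained illustration of the CTMC-to-sampling mapping described above (not the dissertation's actual chromophore network), the sketch below simulates a small continuous-time Markov chain whose absorbing state stands for photon emission; the network size and all transition rates are made-up placeholders, and each simulated absorption time is a draw from the corresponding phase-type distribution.

import numpy as np

rng = np.random.default_rng(1)

# Toy 3-chromophore RET chain: states 0-2 are chromophores, state 3 is an
# absorbing "photon emitted" state. Rates are arbitrary placeholders, not
# measured energy-transfer or fluorescence rates.
RATES = np.array([
    [0.0, 2.0, 0.1, 0.3],   # exits from state 0
    [0.5, 0.0, 1.5, 0.4],   # exits from state 1
    [0.1, 0.6, 0.0, 0.8],   # exits from state 2
    [0.0, 0.0, 0.0, 0.0],   # absorbing state: no exits
])

def sample_emission_time(start_state=0):
    # Gillespie-style simulation: exponential holding time in each state,
    # next state chosen in proportion to the outgoing rates, stop at absorption.
    state, t = start_state, 0.0
    while state != 3:
        rates = RATES[state]
        total = rates.sum()
        t += rng.exponential(1.0 / total)       # exponential holding time
        state = rng.choice(4, p=rates / total)  # jump to the next state
    return t

samples = np.array([sample_emission_time() for _ in range(10_000)])
print("mean simulated emission delay:", samples.mean())

In the physical system described above, the analogous samples are the fluorescence photon detection times, and the rate matrix is determined by the geometry of the chromophore network.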

When RET networks between chromophores are used to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime-coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and the Maximum Likelihood Estimation (MLE)-based taggant identification guarantees high accuracy even with only a few hundred detected photons.

RET-based sampling units (RSUs) can also be constructed to accelerate probabilistic algorithms for a wide range of applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware traditional computers use, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor or GPU as a specialized functional unit, or organized as a discrete accelerator, to bring substantial speedups and power savings.

Relevance:

100.00%

Publisher:

Abstract:

Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. When using this accuracy-oriented conception of memory, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes, and for those who evaluate witness testimony. This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited, or a series of either yes/no or cued questions was asked. Instructions provided by the interviewer indicated to the participants that they should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared with those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, such as in the yes/no condition, people said “I don’t know” to maintain accuracy. However, when withholding responses and adjusting precision were both possible, people utilized both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.

Relevance:

100.00%

Publisher:

Abstract:

A continuous time series of annual soil thaw records, extending from 1994 to 2009, is available for comparison with the records of thaw obtained from the Biocomplexity Experiment (BE) for the period 2006-2009. Discontinuous records of thaw at Barrow from wet tundra sites date back to the 1960s. Comparisons of the longer records with the BE observations reveal strong similarities. Records of permafrost temperature, reflecting changes in the annual surface energy exchange, are available from the 1950s for comparison with results from measurement programs begun in 2002. The long-term systematic geocryological investigations at Barrow indicate an increase in permafrost temperature, especially during the last several years. The increase in near-surface permafrost temperature is most pronounced in winter. Marked trends are not apparent in the active-layer record, although subsidence measurements on the North Slope indicate that penetration into the ice-rich layer at the top of permafrost has occurred over the past decade. Active-layer thickness values from the 1960s are generally higher than those from the 1990s, and are very similar to those of the 2000s. Analysis of spatial active-layer observations at representative locations demonstrates significant variations in active-layer thickness between different landscape types, reflecting the influence of vegetation, substrate, microtopography and, especially, soil moisture. Landscape-specific differences exist in the response of active-layer thickness to climatic forcing. These differences are attributable to the existence of localized controls related to combinations of surface and subsurface characteristics. The geocryological records at Barrow illustrate the importance and effectiveness of sustained, well-organized monitoring efforts in documenting long-term trends.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: The purpose of the study is to review recent studies published from 2007 to 2015 on tourism and hotel demand modeling and forecasting, with a view to identifying the emerging topics and methods studied and to pointing out future research directions in the field. Design/methodology/approach: Articles on tourism and hotel demand modeling and forecasting published in both Science Citation Index (SCI) and Social Science Citation Index (SSCI) journals were identified and analyzed. Findings: This review found that studies focused on hotel demand are relatively fewer than those on tourism demand. It is also observed that more and more studies have moved away from aggregate tourism demand analysis, while disaggregate markets and niche products have attracted increasing attention. Some studies have gone beyond neoclassical economic theory to seek additional explanations of the dynamics of tourism and hotel demand, such as environmental factors, tourist online behavior and consumer confidence indicators, among others. More sophisticated techniques such as nonlinear smooth transition regression, mixed-frequency modeling and nonparametric singular spectrum analysis have also been introduced to this research area. Research limitations/implications: The main limitation of this review is that the articles included only cover the English-language literature. Future reviews of this kind should also include articles published in other languages. The review provides a useful guide for researchers who are interested in future research on tourism and hotel demand modeling and forecasting. Practical implications: This review provides important suggestions and recommendations for improving the efficiency of tourism and hospitality management practices. Originality/value: The value of this review is that it identifies the current trends in tourism and hotel demand modeling and forecasting research and points out future research directions.
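
For context on one of the "more sophisticated techniques" named in the findings, a logistic smooth transition regression for a demand variable y_t can be written in its generic textbook form; this specification is illustrative only and is not taken from any particular study covered by the review:

\[
y_t = \boldsymbol{\phi}'\mathbf{x}_t + \boldsymbol{\theta}'\mathbf{x}_t \, G(s_t;\gamma,c) + \varepsilon_t, \qquad
G(s_t;\gamma,c) = \bigl(1 + \exp\{-\gamma (s_t - c)\}\bigr)^{-1},
\]

where x_t collects the demand determinants, s_t is the transition variable, γ > 0 governs how smoothly the model moves between the two regimes, and c is the threshold.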

Relevance:

100.00%

Publisher:

Abstract:

This paper reports the findings from a study of the learning of English intonation by Spanish speakers within the discourse mode of L2 oral presentation. The purpose of this experiment is, firstly, to compare four prosodic parameters before and after an L2 discourse intonation training programme and, secondly, to confirm whether subjects, after the aforementioned L2 discourse intonation training, are able to match the form of these four prosodic parameters to the discourse-pragmatic function of dominance and control. The instructions and tasks were designed to create the oral and written corpora, and Brazil’s Pronunciation for Advanced Learners of English was adapted for the pedagogical aims of the present study. The learners’ pre- and post-tasks were acoustically analysed, and a pre-/post-questionnaire design was applied to interpret the acoustic analysis. Results indicate that most of the subjects acquired a wider choice of the four prosodic parameters, partly due to the prosodically annotated transcripts that were developed throughout the L2 discourse intonation course. Conversely, qualitative and quantitative data reveal that most subjects failed to match the forms to their appropriate pragmatic functions to express dominance and control in an L2 oral presentation.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The Philippines has a population of approximately 103 million people, of whom 6.7 million live in schistosomiasis-endemic areas and 1.8 million are at risk of infection with Schistosoma japonicum. Although the country-wide prevalence of schistosomiasis japonica in the Philippines is relatively low, the prevalence can be high in some endemic areas, approaching 65%. Of the currently available microscopy-based diagnostic techniques for detecting schistosome infections in the Philippines and elsewhere, most exhibit varying diagnostic performances, with the Kato-Katz (KK) method having particularly poor sensitivity for detecting low-intensity infections. This suggests that the actual prevalence of schistosomiasis japonica may be much higher than previous reports have indicated.

METHODOLOGY/PRINCIPAL FINDINGS: Six barangays (villages) in the municipality of Palapag, Northern Samar, were selected to determine the prevalence of S. japonicum in humans. Fecal samples were collected from 560 humans and examined by the KK method and a validated real-time PCR (qPCR) assay. A high S. japonicum prevalence (90.2%) was revealed by qPCR, whereas the KK method indicated a lower prevalence (22.9%). The geometric mean eggs per gram (GMEPG) was 36.5 as determined by qPCR and 11.5 as determined by the KK method. These results, particularly those obtained by qPCR, indicate that the prevalence of schistosomiasis in this region of the Philippines is much higher than historically reported.

CONCLUSIONS/SIGNIFICANCE: Despite being more expensive, qPCR can complement the KK procedure, particularly for surveillance and monitoring of areas where extensive schistosomiasis control has led to low prevalence and intensity infections and where schistosomiasis elimination is on the horizon, as for example in southern China.

Relevance:

100.00%

Publisher:

Abstract:

Bulk gallium nitride (GaN) power semiconductor devices have been gaining significant interest in recent years, creating the need for technology computer-aided design (TCAD) simulation to accurately model and optimize these devices. This paper comprehensively reviews and compares different GaN physical models and model parameters in the literature, and discusses the appropriate selection of these models and parameters for TCAD simulation. Two-dimensional semi-classical drift-diffusion simulations are carried out for 2.6 kV and 3.7 kV bulk GaN vertical PN diodes. The simulated forward current-voltage and reverse breakdown characteristics are in good agreement with the measurement data, even over a wide temperature range.
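
For reference, the semi-classical drift-diffusion system that such TCAD simulations solve couples Poisson's equation with the carrier continuity and current equations; the generic form below is standard textbook notation and is not the specific model selection or parameter set discussed in the paper:

\[
\nabla \cdot (\varepsilon \nabla \psi) = -q\,(p - n + N_D^{+} - N_A^{-}),
\]
\[
\frac{\partial n}{\partial t} = \frac{1}{q}\,\nabla \cdot \mathbf{J}_n + G - R, \qquad
\frac{\partial p}{\partial t} = -\frac{1}{q}\,\nabla \cdot \mathbf{J}_p + G - R,
\]
\[
\mathbf{J}_n = q\,\mu_n n\,\mathbf{E} + q\,D_n \nabla n, \qquad
\mathbf{J}_p = q\,\mu_p p\,\mathbf{E} - q\,D_p \nabla p,
\]

where ψ is the electrostatic potential, n and p are the electron and hole densities, N_D⁺ and N_A⁻ the ionized dopant concentrations, μ and D the carrier mobilities and diffusivities, and G and R the generation and recombination rates. GaN-specific physics enters through the models chosen for μ, G and R, which is where parameter comparisons of the kind discussed in the paper apply.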