894 results for Many-to-many-assignment problem


Relevance: 100.00%

Publisher:

Abstract:

This paper examines the issue of racial discrimination against Black United States (U.S.) restaurant patrons from a service quality and customer satisfaction perspective. In spite of the progress the industry has made in recent years to alleviate this problem, many contemporary examples clearly demonstrate that racial discrimination is still of great concern. The article stresses the importance of an ethical approach in this human resource management-intensive industry and offers suggestions for reducing discriminatory practices in U.S. restaurant service delivery.

Relevance: 100.00%

Publisher:

Abstract:

In the discussion - The Nevada Gaming Debt Collection Experience - by Larry D. Strate, Assistant Professor, College of Business and Economics at the University of Nevada, Las Vegas, the author initially outlines the article by saying: “Even though Nevada has had over a century of legalized gaming experience, the evolution of gaming debt collection has been a recent phenomenon. The author traces that history and discusses implications of the current law.” The discussion opens with a comparison between the gaming industries of New Jersey/Atlantic City and Las Vegas, Nevada. This contrast serves to point out the disparities in debt handling between the two. “There are major differences in the development of legalized gaming for both Nevada and Atlantic City. Nevada has had over a century of legalized gambling; Atlantic City, New Jersey, has completed a decade of its operation,” Strate informs you. “Nevada's gaming industry has been its primary economic base for many years; Atlantic City's entry into gaming served as a possible solution to a social problem. Nevada's processes of legalized gaming, credit play, and the collection of gaming debts were developed over a period of 125 years; Atlantic City's new industry began with gaming, gaming credit, and gaming debt collection simultaneously in 1976 [via the New Jersey Casino Control Act].” The irony here is that Atlantic City, the younger venue, has had a better system for handling debt collection than the historic and traditional Las Vegas properties. Many of these properties were duplicated in New Jersey, so a dichotomy existed whereby New Jersey casinos could recoup debt while their Nevada counterparts could not. “It would seem logical that a ‘territory’ which permitted gambling in the early 1800s would have allowed the Nevada industry to collect its debts as any other legal enterprise. But it did not,” Strate says. Of course, this situation could not be allowed to continue, and Strate outlines the evolution. New Jersey tactfully benefited from Nevada’s experience. “The fundamental change in gaming debt collection came through the legislature, as the judicial decisions had declared gaming debts uncollectable by either a patron or a casino,” Strate informs you. “Nevada enacted its gaming debt collection act in 1983, six years after New Jersey,” Strate points out. One of the most noteworthy paragraphs in the entire article is this: “The fundamental change in 1983, and probably the most significant change in the history of gaming in Nevada since the enactment of the Open Gaming Law of 1931, was to allow non-restricted gaming licensees* to recover gaming debts evidenced by a credit instrument. The new law incorporated previously litigated terms with a new one, credit instrument.” The term is legally definable and gives Nevada courts an avenue of due process.

Relevance: 100.00%

Publisher:

Abstract:

Physical therapy students must apply the relevant information learned in their academic and clinical experience to problem solve in treating patients. I compared the clinical cognitive competence in patient care of second-year master's students enrolled in two different curricular programs: modified problem-based (M P-B; n = 27) and subject-centered (S-C; n = 41). Main features of S-C learning include lecture and demonstration as the major teaching strategies and no exposure to patients or problem-solving learning until the sciences (knowledge) have been taught. Comparatively, main features of M P-B learning include case study in small student groups as the main teaching strategy, early and frequent exposure to patients, and knowledge and problem-solving skills learned together for each specific case. Basic and clinical orthopedic knowledge was measured with a written test with open-ended items. Problem-solving skills were measured with a written case study patient problem test yielding three subscores: assessment, problem identification, and treatment planning. Results indicated that among the demographic and educational characteristics analyzed, there was a significant difference between groups on ethnicity, bachelor's degree type, admission GPA, and current GPA, but there was no significant difference on gender, age, possession of a physical therapy assistant license, and GRE score. In addition, the M P-B group achieved a significantly higher adjusted mean score on the orthopedic knowledge test after controlling for GRE scores. The S-C group achieved a significantly higher adjusted mean total score and treatment management subscore on the case study test after controlling for orthopedic knowledge test scores. These findings did not support their respective research hypotheses. There was no significant difference between groups on the assessment and problem identification subscores of the case study test. The integrated M P-B approach promoted superior retention of basic and clinical science knowledge. The results on problem-solving skills were mixed. The S-C approach facilitated superior treatment planning skills, but equivalent patient assessment and problem identification skills, by emphasizing all equally and exposing the students to more patients with a wider variety of orthopedic physical therapy needs than in the M P-B approach.

Relevance: 100.00%

Publisher:

Abstract:

Supervisory Control & Data Acquisition (SCADA) systems are used by many industries because of their ability to manage sensors and control external hardware. The problem with commercially available systems is that they are restricted to a local network of users running proprietary software. There was no Internet development guide for giving remote users outside that network control of, and access to, SCADA data and external hardware through simple user interfaces. To solve this problem, a server/client paradigm was implemented to make SCADAs available via the Internet. Two methods were applied and studied: polling of a text file as a low-end technology solution, and a Transmission Control Protocol/Internet Protocol (TCP/IP) socket connection. Users were allowed to log in to a website and remotely control a network of pumps and valves interfaced to a SCADA, enabling them to sample the water quality of different reservoir wells. The results were based on the real-time performance, stability, and ease of use of the remote interface and its programming, and indicated that the most feasible server to implement is the TCP/IP connection. For the user interface, Java applets and ActiveX controls provide the same real-time access.
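The server/client socket approach described in this abstract can be illustrated with a minimal sketch. The code below is not the thesis's implementation (which used Java applets and ActiveX controls on the client side); it only shows the general pattern of a TCP server that accepts simple text commands for hypothetical valves and returns hypothetical status readings. All names, the port number, and the command format are assumptions made for illustration.

```python
import socket

# Hypothetical state of the SCADA-interfaced hardware (names are illustrative only).
valves = {"well_1": "closed", "well_2": "closed"}

def handle(command: str) -> str:
    """Parse a simple text command such as 'OPEN well_1' or 'READ well_2'."""
    parts = command.strip().split()
    if len(parts) == 2 and parts[0] in ("OPEN", "CLOSE") and parts[1] in valves:
        valves[parts[1]] = "open" if parts[0] == "OPEN" else "closed"
        return f"OK {parts[1]} {valves[parts[1]]}"
    if len(parts) == 2 and parts[0] == "READ" and parts[1] in valves:
        return f"{parts[1]} is {valves[parts[1]]}"  # a real system would query the SCADA here
    return "ERROR unknown command"

def serve(host: str = "0.0.0.0", port: int = 5050) -> None:
    """Accept one client at a time and answer line-oriented commands."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _addr = srv.accept()
            with conn:
                data = conn.recv(1024).decode()
                conn.sendall(handle(data).encode())

if __name__ == "__main__":
    serve()
```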

Relevance: 100.00%

Publisher:

Abstract:

Unequal improvements in processor and I/O speeds have made many applications, such as databases and operating systems, increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
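As a rough illustration of the trade-offs named above, the sketch below models a traditional mirrored pair in a few lines of Python: every logical write costs two physical operations, reads alternate between the copies in normal mode, and all traffic falls on the survivor in failure mode. The class and counters are purely illustrative; they do not model distorted mirrors or interleaved declustering.

```python
class MirroredPair:
    """Toy model of disk mirroring: two physical copies of one logical disk."""

    def __init__(self):
        self.ops = [0, 0]    # operation counters per physical disk
        self.failed = None   # index of a failed disk, or None
        self.next_read = 0   # round-robin pointer for read load balancing

    def write(self, block):
        # A logical write must update both copies (the main write penalty of mirroring).
        for disk in (0, 1):
            if disk != self.failed:
                self.ops[disk] += 1

    def read(self, block):
        if self.failed is not None:
            disk = 1 - self.failed       # failure mode: the survivor serves everything
        else:
            disk = self.next_read        # normal mode: alternate between copies
            self.next_read = 1 - self.next_read
        self.ops[disk] += 1
        return disk

pair = MirroredPair()
for b in range(4):
    pair.write(b)
    pair.read(b)
print(pair.ops)  # balanced load in normal mode, e.g. [6, 6]
```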

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this study was to determine whether public space in the suburbs has the same settings as that of a central city or whether it has its own characteristics. In order to approach this problem, the area of Kendall was thoroughly studied by examining aerial maps, historic images, and the writings of local historians such as Donna Knowles Born. Heavy emphasis was placed on the transformation of the original one-mile grid characteristic of the city of Miami. As the area of Kendall was being developed, the grid was transformed into an irregular and organic method of laying out a street system that directly affected pedestrian life. It became evident, therefore, that Kendall is primarily geared toward automobile movement, thus affecting the setting of public space. This also restricted social events, forcing them to concentrate in specific places such as malls. These findings demonstrated that malls are centers of social interaction, concentrating many social activities in one place. In other words, a mall serves as a common meeting place in the otherwise vast spread of the suburbs. This thesis also explains how public spaces in a suburban context can affect the community by working as filtering agents between the immediate context of a particular site and the overall city. The project, a "Wellness Center and Park" for the Kendall area, was an exploration of these filtering agents and the transitions they engendered. The research upon which this project was based recognized the important role of the site's history and extrapolated as to its future potential.

Relevance: 100.00%

Publisher:

Abstract:

The growing need for food is something that worries the world, which has a population that is growing at a geometric progression while their resources grows at an arithmetic progression. To alleviate this problem there are some proposals, including increased food production or reduce waste thereof. Many studies have been conducted in the world in order to reduce food waste that can reach 40% of production, depending on the region. For this purpose techniques are used to retard degradation of foods, including drying. This paper presents a design of a hybrid fruit dryer that uses solar energy and electric energy with automation of the process. To accomplish drying tests were chosen Typical fruits with good acceptability as processed fruits. During the experiments were measured temperature values at different points. Were also measured humidity values, solar radiation and mass. A data acquisition system was built using a Arduino for obtaining temperatures. The data were sent to a program named Secador de Frutas, done in this work, to plot the same. The volume of the drying chamber was 423 liters and despite the unusual size test using mirrors to increase the incidence of direct radiation, showed that the drier is competitive when compared with other solar dryers produced in Hydraulic Machines and Solar Energy Laboratory (LMHES ) UFRN. The drier has been built at a cost of 3 to 5 times smaller than industrial dryers that operate with the same load of fruit. And the energy cost to produce dried fruits was more feasible compared with such dryers that use LPG as an energy source. However, the drying time was longer.
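A minimal sketch of the kind of acquisition loop described in this abstract is shown below, assuming the Arduino prints one comma-separated line of temperatures per sample over a serial link. It is not the Secador de Frutas program itself; the serial port name, baud rate, sample count, and channel names are placeholders.

```python
import serial                     # pyserial
import matplotlib.pyplot as plt

PORT = "/dev/ttyUSB0"             # placeholder; depends on the machine
BAUD = 9600                       # placeholder; must match the Arduino sketch

samples = []
with serial.Serial(PORT, BAUD, timeout=2) as arduino:
    for _ in range(60):                                  # e.g. one minute of 1 Hz samples
        line = arduino.readline().decode(errors="ignore").strip()
        if not line:
            continue
        # Assumed format: "t_chamber,t_ambient,t_collector" in degrees Celsius.
        samples.append([float(v) for v in line.split(",")])

plt.plot(samples)
plt.xlabel("sample")
plt.ylabel("temperature (°C)")
plt.legend(["chamber", "ambient", "collector"])
plt.show()
```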

Relevance: 100.00%

Publisher:

Abstract:

Cutting fluids are lubricants used in machining processes because they offer many benefits for different processes. They have many functions, such as lubrication, cooling, and improvement of surface finishing; they also decrease tool wear and protect the tool against corrosion. Therefore, due to new environmental laws and the demand for green products, new cutting fluids must be developed. These should be biodegradable, non-toxic, and safe for the environment and for operator health. Thus, vegetable oils are a good option to solve this problem, replacing mineral oils. In this context, this work aimed to develop an emulsion cutting fluid from epoxidized vegetable oil, promoting better lubrication and cooling in machining processes while being environmentally friendly. The methodology was divided into five steps: the first was the biolubricant synthesis by an epoxidation reaction. Following this, the biolubricant was characterized in terms of density, acidity, iodine index, oxirane index, viscosity, thermal stability, and chemical composition. The third step was to develop an oil-in-water (O/W) emulsion with different oil concentrations (10, 20 and 25%) and surfactant concentrations (1, 2.5 and 5%). Emulsion stability was also studied. The tribological performance of the emulsions was evaluated in an HFRR (High Frequency Reciprocating Rig), which consists of a ball-on-disc contact. Results showed that the vegetable-based lubricant can be synthesized by the epoxidation reaction; the spectra showed 100% conversion of the unsaturations into epoxy rings. Regarding the tribological assessment, it was observed that the percentage of oil present in the emulsion directly influenced film formation and the coefficient of friction: at higher concentrations the film formation process is slow and unstable, and the coefficient of friction is higher. High concentrations of surfactants did not improve the emulsions' tribological performance. The best performance in friction reduction was observed for the emulsion with 10% oil and 5% surfactant, whose average wear scar was 202 μm.

Relevance: 100.00%

Publisher:

Abstract:

Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for these problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of using an unlimited archive is the guarantee that all nondominated solutions generated in the process will be saved. However, due to the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is to use a limited archive. The problem that emerges from this situation is the need to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archives. This work investigates a technique to be used together with ideas previously proposed in the literature to deal with limited archives. The technique consists of keeping discarded solutions in a secondary archive and periodically recycling these solutions, bringing them back to the optimization. Three methods of recycling are presented. In order to verify whether these ideas are capable of improving the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated based on statistical tests.
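The recycling idea can be sketched as follows, assuming minimization on all objectives. The dominance test, the random-discard policy, and the single recycling rule below are illustrative placeholders, not the three recycling methods actually studied in the work.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class RecyclingArchive:
    def __init__(self, capacity):
        self.capacity = capacity
        self.main = []        # bounded archive of nondominated solutions
        self.secondary = []   # discarded nondominated solutions kept for recycling

    def add(self, sol):
        if any(dominates(m, sol) for m in self.main):
            return
        self.main = [m for m in self.main if not dominates(sol, m)]
        self.main.append(sol)
        if len(self.main) > self.capacity:
            # Over capacity: move some nondominated solution to the secondary archive
            # (a real optimizer would use crowding distance or a similar criterion).
            self.secondary.append(self.main.pop(random.randrange(len(self.main))))

    def recycle(self):
        """Periodically bring discarded solutions back into the optimization."""
        recycled, self.secondary = self.secondary, []
        for sol in recycled:
            self.add(sol)

archive = RecyclingArchive(capacity=3)
for point in [(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)]:
    archive.add(point)
archive.recycle()
print(archive.main, archive.secondary)
```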

Relevance: 100.00%

Publisher:

Abstract:

In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on estimates of the first two moments of the data and the working correlation structure. The confidence regions and hypothesis tests are based on asymptotic normality. The methods are sensitive to misspecification of the variance function and the working correlation structure. Because of such misspecifications, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates. In this situation it is necessary to identify a submodel that adequately represents the data. Including redundant variables may impact the model's accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection based on GEEs; the variable selection and the estimation of the coefficients are carried out simultaneously. We discuss its characteristics and asymptotic properties, and present an algorithm for optimizing PEL. Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.
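For readers unfamiliar with the approach, the standard empirical likelihood ratio for a parameter defined through estimating equations (Owen 1988; Qin and Lawless 1994) takes the generic form below. The exact estimating functions and penalty used in this work may differ, so this is only a sketch of the general setup.

```latex
% Empirical likelihood ratio for a parameter \theta defined by estimating
% functions g_i(\theta) = g(Y_i, X_i; \theta) with E[g_i(\theta_0)] = 0:
R(\theta) = \max\Big\{ \prod_{i=1}^{n} n p_i \;:\;
            p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1,\
            \sum_{i=1}^{n} p_i\, g_i(\theta) = 0 \Big\}.
% A penalized empirical likelihood for variable selection then maximizes
\ell_P(\theta) = \log R(\theta) - n \sum_{j=1}^{p} p_\lambda\!\big(|\theta_j|\big),
% where p_\lambda is a sparsity-inducing penalty (e.g. SCAD) with tuning parameter \lambda.
```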

Relevance: 100.00%

Publisher:

Abstract:

This thesis explores the relationship between democracy and political legitimacy from an epistemic perspective. Democracy, in its most general sense, gives everyone the opportunity to assert the interests they consider to be their own and those of their community, particularly at the ballot box. Voting, as a decision-making procedure, thus in a sense enshrines the freedom and equality enjoyed by every citizen and confers a certain legitimacy on the decision-making process. That said, if voting is not framed by epistemic considerations, nothing guarantees that the resulting political outcome will be desirable for individuals or for the community: it is entirely possible to imagine discriminatory, economically harmful, or simply ineffective policies emerging in this way and taking effect to everyone's detriment. In response to this problem, various democratic theories have emerged and succeeded one another in an attempt to tie the democratic process more closely to the achievement of political objectives that benefit the community. Among them, deliberative democracy has proposed replacing the mere confrontation of interests found in aggregative democracy with a collective search for the common good, channeled through deliberative procedures meant to legitimize the democratic exercise on firmer grounds. Following it, epistemic democracy drew inspiration from deliberative bodies while placing greater emphasis on the quality of the outcomes obtained than on the procedures themselves. In the end, the same dilemma haunts each of these theories: is it preferable to build decision-making institutions by focusing primarily on the procedural criteria themselves, at the risk of seeing bad decisions slip through the process with no way to prevent them, or should we start from a more substantive conception of what constitutes a good decision, at the risk this time of sacrificing the freedom of choice that is supposed to characterize a democratic regime? The thesis defended in this work is that the concept of political equality can help resolve this dilemma by taking the form of both a procedural criterion and a pre-established political objective. Political equality thereby becomes a strong normative source of political legitimacy. Building on David Estlund's epistemic proceduralism, we hope to have shown by the end of this thesis that achieving substantive political equality by means of egalitarian procedures is not a hermetic tautology, but rather a reflexive mechanism that improves at times the robustness of decision-making procedures and at times the achievement of tangible equality in relations among citizens.

Relevance: 100.00%

Publisher:

Abstract:

Using the wisdom of crowds---combining many individual forecasts to obtain an aggregate estimate---can be an effective technique for improving forecast accuracy. When individual forecasts are drawn from independent and identical information sources, a simple average provides the optimal crowd forecast. However, correlated forecast errors greatly limit the ability of the wisdom of crowds to recover the truth. In practice, this dependence often emerges because information is shared: forecasters may to a large extent draw on the same data when formulating their responses.

To address this problem, I propose an elicitation procedure in which each respondent is asked to provide both their own best forecast and a guess of the average forecast that will be given by all other respondents. I study optimal responses in a stylized information setting and develop an aggregation method, called pivoting, which separates individual forecasts into shared and private information and then recombines these results in the optimal manner. I develop a tailored pivoting procedure for each of three information models, and introduce a simple and robust variant that outperforms the simple average across a variety of settings.

In three experiments, I investigate the method and the accuracy of the crowd forecasts. In the first study, I vary the shared and private information in a controlled environment, while the latter two studies examine forecasts in real-world contexts. Overall, the data suggest that a simple minimal pivoting procedure provides an effective aggregation technique that can significantly outperform the crowd average.
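One published formulation of minimal pivoting (see Palley and Soll's work on aggregating forecasts when information is shared) pushes the crowd mean away from the mean prediction of others, which serves as a proxy for the shared signal. The sketch below assumes that specific formula; the tailored procedures developed in this dissertation may differ in detail.

```python
from statistics import mean

def minimal_pivot(own_forecasts, predicted_averages):
    """One common form of minimal pivoting (illustrative, not the dissertation's exact method)."""
    x_bar = mean(own_forecasts)        # average of respondents' own forecasts
    z_bar = mean(predicted_averages)   # average of their guesses of the others' average
    return x_bar + (x_bar - z_bar)     # equivalently 2 * x_bar - z_bar

# Toy example: shared information pulls everyone toward 60, private signals vary.
own  = [62, 58, 65, 55]
pred = [60, 59, 61, 58]
print(mean(own), minimal_pivot(own, pred))  # simple average vs. pivoted estimate
```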

Relevance: 100.00%

Publisher:

Abstract:

Development of reliable methods for optimised energy storage and generation is one of the most imminent challenges in modern power systems. In this paper, an adaptive approach to the load-leveling problem is proposed, using novel dynamic models based on Volterra integral equations of the first kind with piecewise continuous kernels. These integral equations efficiently solve this inverse problem, taking into account both the time-dependent efficiencies and the generation/storage availability of each energy storage technology. In this analysis a direct numerical method is employed to find the least-cost dispatch of the available storage. The proposed collocation-type numerical method has second-order accuracy and enjoys self-regularization properties, which are associated with the confidence levels of system demand. This adaptive approach is suitable for energy storage optimisation in real time. The efficiency of the proposed methodology is demonstrated on the Single Electricity Market of the Republic of Ireland and Northern Ireland.
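The generic form of the model class mentioned above is a Volterra integral equation of the first kind whose kernel is defined piecewise on time-varying subintervals. The symbols below are generic placeholders (in the paper the kernels encode the storage efficiencies and availabilities), so this is only a sketch of the setup rather than the paper's exact model.

```latex
% First-kind Volterra equation linking the unknown storage/generation
% schedule x(s) to the demand profile f(t):
\int_{0}^{t} K(t,s)\, x(s)\, ds = f(t), \qquad 0 \le t \le T,
% with a piecewise continuous kernel defined on time-varying subintervals:
K(t,s) = K_i(t,s) \quad \text{for } \alpha_{i-1}(t) < s < \alpha_i(t),
\qquad i = 1,\dots,m, \quad \alpha_0(t) = 0,\ \alpha_m(t) = t.
```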

Relevance: 100.00%

Publisher:

Abstract:

Stovepipes, also called silos, appear in many different organizations and sectors and contribute to problems when employees or managers look more to their own or their department's objectives than to the organization's. The purpose of this study was to identify different communicative factors that promote stovepipes in order to further identify the most critical factor to disarm. A case study was conducted at a selected company with a stovepipe structure in order to achieve the purpose of the study. The case study included interviews and observations to identify different problem areas, which were then compared with three communicative factors identified in previous studies. The factor that had the most connections to the problem areas was considered the most critical factor. The results of the study indicate that "A lack of understanding of each other's work" is the most critical factor in stovepipe structures and that it can be prevented by following five recommendations: bring up positive collaboration continually, raise problems with each other instead of with others, identify different communication paths within and between the departments, implement a long-term model for preventing stovepipes, and set up workshops between the involved departments. The conclusion of the study is that stovepipes create several undesirable effects in the organization, but that the efforts to counter these problems do not have to be complicated. Following five small steps toward better collaboration and communication can be enough to put an organization on its way to a better structure.