20 results for popularity

in Digital Commons at Florida International University


Relevance:

10.00%

Publisher:

Abstract:

Over the past one hundred years, interscholastic athletic programs have evolved to a place of prominence in both public and private education across America. The National Federation of State High School Associations (NFHS) estimates that approximately 3.96 million males and 2.80 million females participated in organized high school athletic programs during the 2001–2002 school year at more than 17,000 public and private high schools. The popularity of interscholastic athletic programs has resulted in continuous investigations of the relationship between high school athletic programs and academic performance.

The present study extends earlier investigations by examining the relation of athletic participation to several indicators of academic performance for senior high school students. This research examined: (a) average daily attendance of varsity athletes and non-athletes; (b) final cumulative grade point average; and (c) test scores on the tenth grade Florida Comprehensive Assessment Test (FCAT) in both reading and mathematics.

Data were collected on 2,081 randomly selected male and female high school students identified as athletes or non-athletes at ten public senior high schools in the Miami-Dade County Public Schools district. The results of the overall analyses showed a positive and significant relationship between athletic participation and educational performance. On average, athletes were absent fewer days from school per year than non-athletes, and athletes earned a significantly higher cumulative grade point average than their non-athlete peers. A statistically significant difference was also found in the tenth grade FCAT scores in both reading and mathematics between athletes and non-athletes when eighth grade FCAT scores in reading and mathematics were used as covariates. Athletes earned significantly higher Grade 10 FCAT scores in both reading and mathematics than non-athletes.

Although cause and effect cannot be inferred from this study, the findings do indicate the potentially beneficial value of athletic programs in public secondary education. The study concluded that those who set Florida high school graduation requirements might seriously consider the role of interscholastic athletic programs as a valid and essential extracurricular activity.
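
The covariate-adjusted comparison described in this abstract can be illustrated with a minimal ANCOVA-style sketch in Python. This is not the study's actual analysis: the tiny data set and the column names (athlete, grade8_fcat, grade10_fcat) are hypothetical, and statsmodels is simply one convenient way to fit such a model.

```python
# Minimal ANCOVA-style sketch: compare Grade 10 FCAT scores of athletes and
# non-athletes while controlling for Grade 8 FCAT scores (hypothetical data).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical records: 1 = athlete, 0 = non-athlete.
df = pd.DataFrame({
    "athlete":      [1, 1, 1, 0, 0, 0, 1, 0],
    "grade8_fcat":  [305, 290, 310, 300, 285, 295, 315, 280],
    "grade10_fcat": [330, 318, 340, 312, 300, 308, 345, 295],
})

# The Grade 8 score enters as the covariate; the athlete coefficient estimates
# the adjusted difference between the two groups.
model = smf.ols("grade10_fcat ~ athlete + grade8_fcat", data=df).fit()
print(model.summary())
```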

Relevance:

10.00%

Publisher:

Abstract:

With the advantages and popularity of Permanent Magnet (PM) motors due to their high power density, there is an increasing incentive to use them in a variety of applications, including electric actuation. These applications have strict noise emission standards. The generation of audible noise and associated vibration modes is characteristic of all electric motors, but it is especially problematic in low-speed sensorless rotary actuation applications that use the high frequency voltage injection technique. This dissertation is aimed at solving the problem of optimizing the sensorless control algorithm for low noise and vibration while achieving at least 12-bit absolute accuracy for speed and position control. The low speed sensorless algorithm is simulated using an improved Phase Variable Model, developed and implemented in a hardware-in-the-loop prototyping environment. Two experimental testbeds were developed and built to test and verify the algorithm in real time.

A neural network based modeling approach was used to predict the audible noise due to the high frequency injected carrier signal. This model was created based on noise measurements in a specially built chamber. The developed noise model is then integrated into the high frequency based sensorless control scheme so that appropriate tradeoffs and mitigation techniques can be devised. This will improve the position estimation and control performance while keeping the noise below a certain level. Genetic algorithms were used to incorporate the noise optimization parameters into the developed control algorithm.

A novel wavelet based filtering approach was proposed in this dissertation for the sensorless control algorithm at low speed. This novel filter was capable of extracting the position information at low values of injection voltage where conventional filters fail. This filtering approach can be used in practice to reduce the injected voltage in the sensorless control algorithm, resulting in a significant reduction of noise and vibration.

Online optimization of the sensorless position estimation algorithm was performed to reduce vibration and to improve position estimation performance. The results obtained are important and represent original contributions that can be helpful in choosing optimal parameters for sensorless control algorithms in many practical applications.
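
The wavelet-based filtering idea can be sketched generically in Python with PyWavelets. The signal model, the db4 wavelet, and the universal-threshold rule below are illustrative assumptions, not the dissertation's actual filter design.

```python
# Wavelet denoising sketch: recover a weak, slowly varying position-related
# signal buried in noise (illustrative of filtering at low injection voltage).
import numpy as np
import pywt

fs = 10_000                                   # sample rate, Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)
position_signal = 0.05 * np.sin(2 * np.pi * 5 * t)      # weak underlying signal
noisy = position_signal + 0.02 * np.random.randn(t.size)

# Multilevel wavelet decomposition, soft-threshold the detail coefficients,
# then reconstruct an estimate of the underlying signal.
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise level estimate
threshold = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

print("RMS error before:", np.sqrt(np.mean((noisy - position_signal) ** 2)))
print("RMS error after: ", np.sqrt(np.mean((denoised - position_signal) ** 2)))
```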

Relevance:

10.00%

Publisher:

Abstract:

Since multimedia data, such as images and videos, are far more expressive and informative than ordinary text-based data, people find them more attractive for communication and expression. Additionally, with the rising popularity of social networking tools such as Facebook and Twitter, multimedia information retrieval can no longer be considered a solitary task. Rather, people constantly collaborate with one another while searching for and retrieving information. But the very cause of the popularity of multimedia data, the huge amount and varied types of information a single data object can carry, makes their management a challenging task. Multimedia data are commonly represented as multidimensional feature vectors and carry high-level semantic information. These two characteristics make them very different from traditional alpha-numeric data. Thus, trying to manage them with frameworks and rationales designed for primitive alpha-numeric data is inefficient. An index structure is the backbone of any database management system, yet the index structures present in existing relational database management frameworks cannot handle multimedia data effectively. Thus, in this dissertation, a generalized multidimensional index structure is proposed which accommodates the atypical multidimensional representation and the semantic information carried by different multimedia data seamlessly within one single framework. Additionally, the dissertation investigates the evolving relationships among multimedia data in a collaborative environment and how such information can help to customize the design of the proposed index structure when it is used to manage multimedia data in a shared environment. Extensive experiments were conducted to demonstrate the usability and superior performance of the proposed framework over current state-of-the-art approaches.
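
As a rough illustration of the underlying problem (indexing multidimensional feature vectors for similarity search), the sketch below uses an off-the-shelf k-d tree; the dissertation's generalized index structure and its semantic extensions are not reproduced here.

```python
# Sketch of the core problem a multidimensional index addresses: fast
# nearest-neighbor search over feature vectors extracted from multimedia
# objects. A k-d tree stands in for the proposed generalized structure.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
features = rng.random((10_000, 64))      # 10,000 objects, 64-D feature vectors
tree = cKDTree(features)

query = rng.random(64)                   # feature vector of a query image
distances, indices = tree.query(query, k=5)
print("Top-5 matches:", indices, distances)
```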

Relevance:

10.00%

Publisher:

Abstract:

Composite indices and rankings are increasingly popular for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation, and attribute weighting involved in their computation. The integrated model is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach developed in this dissertation was automated through an IT artifact that was designed, developed, and evaluated based on the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured in per capita income, which yielded more sensitive results. Low income level countries exhibited more sensitivity in their rankings and less agreement between the benchmark rankings and our multi-method based rankings than higher income country groups.
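
The scenario-generation idea can be sketched by recomputing a composite index under different normalization, weighting, and aggregation choices. The sub-indicators, weights, and country scores below are hypothetical, and the code is a minimal Python sketch rather than the dissertation's VBA artifact.

```python
# Recompute a composite index and its rankings under alternative
# methodological scenarios (normalization x aggregation), hypothetical data.
import numpy as np
import pandas as pd

data = pd.DataFrame(
    {"starting_business": [0.9, 0.6, 0.3], "getting_credit": [0.4, 0.8, 0.7]},
    index=["Country A", "Country B", "Country C"],
)
weights = pd.Series({"starting_business": 0.5, "getting_credit": 0.5})

def normalize(df, how):
    if how == "minmax":
        return (df - df.min()) / (df.max() - df.min())
    if how == "zscore":
        return (df - df.mean()) / df.std()
    raise ValueError(how)

def aggregate(df, w, how):
    if how == "arithmetic":
        return df.mul(w).sum(axis=1)
    if how == "geometric":
        # Clip to keep the log defined; only meaningful for positive scores.
        return np.exp(np.log(df.clip(lower=1e-9)).mul(w).sum(axis=1))
    raise ValueError(how)

for norm in ("minmax", "zscore"):
    for agg in ("arithmetic", "geometric"):
        index = aggregate(normalize(data, norm), weights, agg)
        print(norm, agg, index.rank(ascending=False).to_dict())
```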

Relevance:

10.00%

Publisher:

Abstract:

The number of dividend paying firms has been on the decline since the rise in popularity of stock repurchases in the 1980s, and the recent financial crisis has brought about a wave of dividend reductions and omissions. This dissertation examined U.S. firms and American Depository Receipts listed on U.S. equity exchanges according to their dividend paying history over the previous twelve quarters. While accounting for the state of the economy and the firm's size, profitability, earned equity, and growth opportunities, it determined whether or not a firm would pay a dividend in the next quarter. It also examined the likelihood of a dividend change. Further, returns of firms were examined according to their dividend paying history and the state of the economy using the Fama-French three-factor model. Forward, backward, and stepwise selection logistic regressions show that firms with a history of regular and uninterrupted dividend payments are likely to continue to pay dividends, while firms that do not have a history of regular dividend payments are not likely to begin to pay dividends or continue to do so. The results of a set of generalized polytomous logistic regressions imply that dividend paying firms are more likely to reduce dividend payments during economic expansions, as opposed to recessions. The analysis of returns using the Fama-French three-factor model also reveals that dividend paying firms earn significant abnormal positive returns. As a special case, a similar analysis of dividend payment and dividend change was applied to American Depository Receipts that trade on the NYSE, NASDAQ, and AMEX exchanges and are issued by the Bank of New York Mellon. Returns of American Depository Receipts were examined using the Fama-French two-factor model for international firms. The results of the generalized polytomous logistic regression analyses indicate that dividend paying status and economic conditions are also important for dividend level changes of American Depository Receipts, and Fama-French two-factor regressions alone do not adequately explain returns for these securities.
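
A minimal sketch of the Fama-French three-factor regression used in the return analysis is shown below. The monthly factor and portfolio return series are simulated; in the actual study, the intercept (alpha) estimated on real data is what indicates abnormal returns for dividend payers.

```python
# Fama-French three-factor regression sketch on simulated monthly data:
# excess return = alpha + b1*(Mkt-RF) + b2*SMB + b3*HML + error.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60  # five years of monthly observations
factors = pd.DataFrame({
    "mkt_rf": rng.normal(0.005, 0.04, n),
    "smb":    rng.normal(0.002, 0.03, n),
    "hml":    rng.normal(0.003, 0.03, n),
})
# Hypothetical excess returns of a dividend-paying portfolio.
excess_ret = (0.002 + 0.9 * factors["mkt_rf"] + 0.2 * factors["hml"]
              + rng.normal(0, 0.01, n))

X = sm.add_constant(factors)
fit = sm.OLS(excess_ret, X).fit()
print(fit.params)        # const (alpha) and mkt_rf, smb, hml loadings
```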

Relevance:

10.00%

Publisher:

Abstract:

For the last three decades, the Capital Asset Pricing Model (CAPM) has been a dominant model for calculating expected return. In the early 1990s, Fama and French (1992) developed the Fama-French three-factor model by adding two additional factors to the CAPM. However, even with these models, it has been found that estimates of the expected return are not accurate (Elton, 1999; Fama & French, 1997). Botosan (1997) introduced a new approach to estimating the expected return. This approach employs an equity valuation model to calculate the internal rate of return (IRR), often called the "implied cost of equity capital," as a proxy for the expected return. This approach has been gaining popularity among researchers. A critical review of the literature will help inform hospitality researchers regarding the issue and encourage them to implement the new approach in their own studies.
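
The implied cost of equity idea can be sketched as solving for the discount rate that equates the current price to forecast payoffs. The dividend-plus-terminal-value model and the numbers below are illustrative assumptions, not Botosan's (1997) exact specification.

```python
# Implied cost of equity sketch: find the IRR that equates today's share price
# to forecast payoffs under a simple dividend-plus-terminal-value model.
from scipy.optimize import brentq

price = 50.0
dividends = [2.0, 2.2, 2.4]        # forecast dividends, years 1-3 (hypothetical)
terminal_growth = 0.03

def pricing_error(r):
    # Present value of forecast dividends plus a growing-perpetuity terminal
    # value, minus the observed price; the root in r is the implied IRR.
    pv = sum(d / (1 + r) ** t for t, d in enumerate(dividends, start=1))
    terminal = dividends[-1] * (1 + terminal_growth) / (r - terminal_growth)
    pv += terminal / (1 + r) ** len(dividends)
    return pv - price

implied_r = brentq(pricing_error, terminal_growth + 1e-6, 1.0)
print(f"Implied cost of equity: {implied_r:.2%}")
```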

Relevance:

10.00%

Publisher:

Abstract:

Menu engineering is a methodology for classifying menu items by their contribution margin and popularity. The process discounts the importance of food cost percentage, recognizing that operators deposit cash, not percentages. The authors raise the issue that strict application of the principles of menu engineering may result in an erroneous evaluation of a menu item, and may be of little use when the variable portion of labor is not considered. They describe an enhancement to the process that incorporates labor.
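
A minimal sketch of the contribution-margin/popularity classification that menu engineering performs is shown below. The menu data are hypothetical, the 70% popularity threshold and category names follow the commonly cited Smith-Kasavana formulation, and labor, the enhancement the authors propose, is deliberately omitted.

```python
# Menu engineering sketch: classify items by contribution margin (CM) versus
# the menu's sales-weighted average CM and by sales share versus a threshold.
items = {
    # name: (number_sold, selling_price, food_cost)
    "Grilled Salmon":  (120, 24.00, 9.50),
    "Pasta Primavera": (300, 16.00, 4.00),
    "Ribeye Steak":    ( 60, 32.00, 15.00),
    "House Burger":    (250, 14.00, 5.50),
}

total_sold = sum(n for n, _, _ in items.values())
avg_cm = sum(n * (p - c) for n, p, c in items.values()) / total_sold
popularity_threshold = 0.70 * (1.0 / len(items))   # common Smith-Kasavana rule

for name, (n, price, cost) in items.items():
    cm = price - cost
    share = n / total_sold
    high_cm = cm >= avg_cm
    high_pop = share >= popularity_threshold
    label = {(True, True): "Star", (True, False): "Puzzle",
             (False, True): "Plowhorse", (False, False): "Dog"}[(high_cm, high_pop)]
    print(f"{name:16s} CM=${cm:5.2f} share={share:5.1%} -> {label}")
```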

Relevance:

10.00%

Publisher:

Abstract:

Even though the popularity and usage of teleconferencing is evident primarily outside the lodging industry, lodging operators cannot choose to ignore the role teleconferencing will play in meeting the changing needs of guests. The authors discuss the factors that spurred the growth of teleconferencing, the opportunities and threats faced by lodging operators, and suggestions for taking advantage of the technology.

Relevance:

10.00%

Publisher:

Abstract:

In the article “Menu Analysis: Review and Evaluation” by Lendal H. Kotschevar, Distinguished Professor, School of Hospitality Management, Florida International University, Kotschevar’s initial statement reads: “Various methods are used to evaluate menus. Some have quite different approaches and give different information. Even those using quite similar methods vary in the information they give. The author attempts to describe the most frequently used methods and to indicate their value. A correlation calculation is made to see how well certain of these methods agree in the information they give.”

There is more than one way to look at the word menu. One is the set of culinary selections decided upon by the head chef or owner, which ultimately defines the type of restaurant. Another is the physical list of offerings that a patron actually holds in his or her hand. These are the most common senses of the word. The author concentrates primarily on the latter, and uses the count of items sold from a menu to measure the popularity of any particular item. This, along with a formula, allows Kotschevar to arrive at a specific value per item.

Menu analysis would appear a difficult subject to broach: how does one qualify and quantify a menu? It seems such a subjective exercise. The author offers methods and outlines for approaching menu analysis from empirical perspectives. “Menus are often examined visually through the evaluation of various factors. It is a subjective method but has the advantage of allowing scrutiny of a wide range of factors which other methods do not,” says Kotschevar. “The method is also highly flexible. Factors can be given a score value and scores summed to give a total for a menu. This allows comparison between menus. If the one making the evaluations knows menu values, it is a good method of judgment,” he further offers.

The author wants you to know that assigning values is fundamental to a pragmatic menu analysis; it is how the reviewer keeps score, so to speak. Value merit provides reliable criteria from which to gauge a particular menu item. In the final analysis, menu evaluation provides the mechanism for either keeping or rejecting selected items on a menu.

Kotschevar describes at least three different matrix evaluation methods: the Miller method, the Smith and Kasavana method, and the Pavesic method. He offers illustrated examples of each in table format, helpful tools since trying to explain the theories behind the tables would be difficult at best. Kotschevar also references analysis methods which aren’t matrix based; the Hayes and Huffman goal value analysis is one such method. The author sees no one method as better than another and suggests that combining two or more of the methods is beneficial.
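
Kotschevar's correlation check, how closely different menu-analysis methods agree, can be sketched as a rank correlation between the orderings two methods produce for the same items. The method names and scores below are hypothetical.

```python
# Sketch of the agreement check described above: Spearman rank correlation
# between the rankings two menu-analysis methods assign to the same items.
from scipy.stats import spearmanr

items = ["Salmon", "Pasta", "Ribeye", "Burger", "Salad"]
method_a_scores = [8.2, 6.5, 9.1, 7.0, 4.3]   # e.g., a contribution-margin based method
method_b_scores = [7.9, 7.2, 8.4, 6.1, 5.0]   # e.g., a goal-value style method

rho, p_value = spearmanr(method_a_scores, method_b_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f}) across {len(items)} items")
```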

Relevance:

10.00%

Publisher:

Abstract:

In the discussion “Indirect Cost Factors in Menu Pricing” by David V. Pavesic, Associate Professor of Hotel, Restaurant and Travel Administration at Georgia State University, Pavesic initially states: “Rational pricing methodologies have traditionally employed quantitative factors to mark up food and beverage or food and labor because these costs can be isolated and allocated to specific menu items. There are, however, a number of indirect costs that can influence the price charged because they provide added value to the customer or are affected by supply/demand factors. The author discusses these costs and factors that must be taken into account in pricing decisions.”

Pavesic offers as a given that menu pricing should cover costs, return a profit, reflect a value for the customer, and, in the long run, attract customers and market the establishment. “Prices that are too high will drive customers away, and prices that are too low will sacrifice profit,” he puts it succinctly. To dovetail with this premise, the author notes that although food cost figures markedly into menu pricing, factors such as equipment utilization, popularity/demand, and marketing must also be considered. “… there is no single method that can be used to mark up every item on any given restaurant menu. One must employ a combination of methodologies and theories,” says Pavesic. “Therefore, when properly carried out, prices will reflect food cost percentages, individual and/or weighted contribution margins, price points, and desired check averages, as well as factors driven by intuition, competition, and demand.”

Additionally, Pavesic argues that value, as opposed to maximizing revenue, should be a primary motivating factor when designing menu pricing, a philosophy that comes with certain caveats, which he explains. Generically speaking, Pavesic says, “The market ultimately determines the price one can charge.” But, in fine-tuning that decree, he further offers, “Lower prices do not automatically translate into value and bargain in the minds of the customers. Having the lowest prices in your market may not bring customers or profit. Too often operators engage in price wars through discount promotions and find that profits fall and their image in the marketplace is lowered,” Pavesic warns.

In reference to intangibles that influence menu pricing, service is at the top of the list. Ambience, location, amenities, product [i.e., food] presentation, and price elasticity are discussed as well. Be aware of price-value perception; Pavesic explains this concept. He closes with a brief overview of a la carte pricing and its pros and cons.
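
The quantitative markup methods that the indirect factors are meant to adjust can be sketched with two common base-price calculations; the figures below are hypothetical and are not drawn from the article.

```python
# Sketch of two common quantitative base-price methods that indirect-cost
# considerations would then adjust (hypothetical numbers).
food_cost = 4.50                 # raw food cost of the item, in dollars

# 1) Food cost percentage method: price so food cost is a target share of price.
target_food_cost_pct = 0.30
price_fc = food_cost / target_food_cost_pct

# 2) Contribution margin method: add a fixed dollar margin intended to cover
#    labor, overhead, and profit.
target_contribution_margin = 9.75
price_cm = food_cost + target_contribution_margin

print(f"Food cost percentage price: ${price_fc:.2f}")
print(f"Contribution margin price:  ${price_cm:.2f}")
```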

Relevance:

10.00%

Publisher:

Abstract:

When the author wrote her first article for the FIU Hospitality Review on leveraged buyouts some five years ago, this business strategy was beginning to enjoy increasing popularity. Since that time, leveraged buyouts grew to unprecedented levels in both the number and size of transactions. However, following the failure of the UAL proposal and the collapse of the junk bond market in 1989, there has been a marked slowdown in buyout activity. This article examines major developments affecting leveraged buyouts over the past five years and addresses their future implications for the hospitality industry.

Relevance:

10.00%

Publisher:

Abstract:

Electronic database handling of business information has gradually gained popularity in the hospitality industry. This article provides an overview of the fundamental concepts of a hotel database and investigates the feasibility of incorporating computer-assisted data mining techniques into hospitality database applications. The author also exposes some potential myths associated with data mining in hospitality database applications.

Relevance:

10.00%

Publisher:

Abstract:

Over the past five years, XML has been embraced by both the research and industrial communities due to its promising prospects as a new data representation and exchange format on the Internet. The widespread popularity of XML creates an increasing need to store XML data in persistent storage systems and to enable sophisticated XML queries over the data. The currently available approaches to XML storage and retrieval are limited in that they are either not mature enough (e.g., native approaches) or cause inflexibility, heavy fragmentation, and excessive join operations (e.g., non-native approaches such as the relational database approach).

In this dissertation, I studied the issue of storing and retrieving XML data using the Semantic Binary Object-Oriented Database System (Sem-ODB), to leverage the advanced Sem-ODB technology with the emerging XML data model. First, a meta-schema based approach was implemented to address the data model mismatch issue that is inherent in non-native approaches. The meta-schema based approach captures the meta-data of both Document Type Definitions (DTDs) and Sem-ODB Semantic Schemas, thereby enabling a dynamic and flexible mapping scheme. Second, a formal framework was presented to ensure precise and concise mappings. In this framework, both the schemas and the conversions between them are formally defined and described. Third, after the major features of an XML query language, XQuery, were analyzed, a high-level XQuery to Semantic SQL (Sem-SQL) query translation scheme was described. This translation scheme takes advantage of the navigation-oriented query paradigm of Sem-SQL, thereby avoiding the excessive join problem of relational approaches. Finally, the modeling capability of the Semantic Binary Object-Oriented Data Model (Sem-ODM) was explored from the perspective of conceptually modeling an XML Schema using a Semantic Schema.

It was revealed that the advanced features of the Sem-ODB, such as multi-valued attributes, surrogates, and the navigation-oriented query paradigm, among others, are indeed beneficial in coping with the XML storage and retrieval issue using a non-XML approach. Furthermore, extensions to the Sem-ODB to make it work more effectively with XML data were also proposed.
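
Sem-SQL syntax is not reproduced here; as a generic illustration of the navigation-oriented style of querying XML (as opposed to joining shredded relational tables), the sketch below runs a simple path query with Python's standard library.

```python
# Generic illustration of navigation-oriented access to XML (not Sem-SQL):
# find the titles of books published after 2000 by walking the tree.
import xml.etree.ElementTree as ET

xml_doc = """
<catalog>
  <book year="1998"><title>Early Work</title><author>A. Smith</author></book>
  <book year="2003"><title>Later Work</title><author>B. Jones</author></book>
</catalog>
"""

root = ET.fromstring(xml_doc)
# Navigate catalog -> book -> title instead of joining shredded tables.
titles = [b.find("title").text
          for b in root.findall("book")
          if int(b.get("year")) > 2000]
print(titles)   # ['Later Work']
```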

Relevance:

10.00%

Publisher:

Abstract:

The deployment of wireless communications coupled with the popularity of portable devices has led to significant research in the area of mobile data caching. Prior research has focused on the development of solutions that allow applications to run in wireless environments using proxy based techniques. Most of these approaches are semantic based and do not provide adequate support for representing the context of a user (i.e., the interpreted human intention). Although the context may be treated implicitly, it is still crucial to data management. To address this challenge, this dissertation focuses on two predictions: (i) the future location of the user and (ii) the locations over which a fetched data item remains a valid answer to the query. Using this approach, more complete information about the dynamics of an application environment is maintained.

The contribution of this dissertation is a novel data caching mechanism for pervasive computing environments that can adapt dynamically to a mobile user's context. In this dissertation, we design and develop a conceptual model and context aware protocols for wireless data caching management. Our replacement policy uses the validity of the data fetched from the server and the neighboring locations to decide which of the cache entries is least likely to be needed in the future, and is therefore a good candidate for eviction when cache space is needed. The context aware prefetching algorithm exploits the query context to effectively guide the prefetching process. The query context is defined using a mobile user's movement pattern and requested information context. Numerical results and simulations show that the proposed prefetching and replacement policies significantly outperform conventional ones.

Anticipated applications of these solutions include biomedical engineering, tele-health, medical information systems, and business.
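
A minimal sketch of the location-aware replacement idea is shown below: evict the cached entry whose valid region is least likely to cover the user's predicted next location. The circular valid regions, the linear motion prediction, and all data are illustrative assumptions, not the dissertation's actual protocol.

```python
# Location-aware cache replacement sketch: prefer to evict entries whose valid
# region lies far from the user's predicted next position (illustrative only).
import math
from dataclasses import dataclass

@dataclass
class CacheEntry:
    key: str
    center: tuple        # center of the region where the answer is valid (x, y)
    radius: float        # radius of that valid region

def predict_next(position, velocity, dt=1.0):
    # Simple linear motion model for the user's next location.
    return (position[0] + velocity[0] * dt, position[1] + velocity[1] * dt)

def eviction_candidate(cache, predicted):
    # Score = distance from the predicted location to the edge of each valid
    # region; the entry least likely to be useful (largest score) goes first.
    def score(entry):
        return math.dist(predicted, entry.center) - entry.radius
    return max(cache, key=score)

cache = [
    CacheEntry("nearest_gas_station", center=(0.0, 0.0), radius=2.0),
    CacheEntry("local_weather",       center=(5.0, 5.0), radius=4.0),
    CacheEntry("traffic_segment_12",  center=(9.0, 1.0), radius=1.0),
]
predicted = predict_next(position=(1.0, 1.0), velocity=(1.5, 0.0))
print("Evict:", eviction_candidate(cache, predicted).key)
```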

Relevance:

10.00%

Publisher:

Abstract:

Background

Sucralose has gained popularity as a low calorie artificial sweetener worldwide. Due to its high stability and persistence, sucralose shows widespread occurrence in environmental waters, at concentrations that can reach up to several μg/L. Previous studies have used time consuming sample preparation methods (offline solid phase extraction/derivatization) or methods with rather high detection limits (direct injection) for sucralose analysis. This study describes a faster and more sensitive analytical method for the determination of sucralose in environmental samples.

Results

An online SPE-LC-MS/MS method was developed, capable of quantifying sucralose in 12 minutes using only 10 mL of sample, with method detection limits (MDLs) of 4.5 ng/L, 8.5 ng/L and 45 ng/L for deionized water, drinking water and reclaimed water (diluted 1:10 with deionized water), respectively. Sucralose was detected in 82% of the reclaimed water samples at concentrations reaching up to 18 μg/L. The monthly average for a period of one year was 9.1 ± 2.9 μg/L. The calculated per capita mass load of sucralose discharged through WWTP effluents, based on the concentrations detected in U.S. wastewaters, is 5.0 mg/day/person. As expected, the concentrations observed in drinking water were much lower but still relevant, reaching as high as 465 ng/L. In order to evaluate the stability of sucralose, photodegradation experiments were performed in natural waters. Significant photodegradation of sucralose was observed only in freshwater at 254 nm. Minimal degradation (<20%) was observed for all matrices under more natural conditions (350 nm or solar simulator). The only photolysis product of sucralose identified by high resolution mass spectrometry was a de-chlorinated molecule at m/z 362.0535, with molecular formula C12H20Cl2O8.

Conclusions

The online SPE-LC-APCI/MS/MS method developed in this study was applied to more than 100 environmental samples. Sucralose was frequently detected (>80%), indicating that the conventional treatment processes employed in sewage treatment plants are not efficient at removing it. Detection of sucralose in drinking waters suggests potential contamination of surface and ground water sources by anthropogenic wastewater streams. Its high resistance to photodegradation, minimal sorption, and high solubility indicate that sucralose could be a good tracer of anthropogenic wastewater intrusion into the environment.
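
The per-capita mass load quoted above follows from multiplying an effluent concentration by a per-capita wastewater flow. The sketch below is only a back-of-the-envelope consistency check; the roughly 550 L/day/person flow is an assumed typical U.S. value, not a figure taken from the study.

```python
# Back-of-the-envelope check of a per-capita mass load: effluent concentration
# multiplied by per-capita wastewater flow. The flow is an assumed typical
# value, not one reported in the study.
effluent_concentration_ug_per_L = 9.1      # monthly average reported above
per_capita_flow_L_per_day = 550            # assumed typical U.S. wastewater flow

load_ug_per_day = effluent_concentration_ug_per_L * per_capita_flow_L_per_day
print(f"~{load_ug_per_day / 1000:.1f} mg/day/person")   # ~5.0 mg/day/person
```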