880 results for analysis of performance


Relevance:

100.00%

Abstract:

This research examined the factors contributing to the performance of online grocers prior to, and following, the 2000 dot-com collapse. The primary goals were to assess the relationship between a company’s business model(s) and its performance in the online grocery channel and to determine if there were other company- and/or market-related factors that could account for company performance. To assess the primary goals, a case-based theory-building process was utilized. A three-way cross-case analysis comprising Peapod, GroceryWorks, and Tesco examined the common profit components, the structural category (e.g., pure-play, partnership, and hybrid) profit components, and the idiosyncratic profit components related to each specific company. Based on the analysis, it was determined that online grocery store business models could be represented at three distinct but hierarchically related levels. The first level was termed the core model and represented the basic profit structure that all online grocers needed in order to conduct operations. The next model level was termed the structural model and represented the profit structure associated with the specific business model configuration (i.e., pure-play, partnership, hybrid). The last model level was termed the augmented model and represented the company’s business model when idiosyncratic profit components were included. In relation to the five company-related factors, scalability, rate of expansion, and automation level were potential candidates for helping to explain online grocer performance. In addition, all the market-structure-related factors were deemed possible candidates for helping to explain online grocer performance. The study concluded by positing an alternative hypothesis concerning the performance of online grocers. Prior to this study, the prevailing wisdom was that business models were the primary cause of online grocer performance. However, based on the core model analysis, it was hypothesized that customer relationship activities (i.e., advertising, promotions, and loyalty program tie-ins) were the real drivers of online grocer performance.

Relevance:

100.00%

Abstract:

We present an extensive photometric catalog for 548 CALIFA galaxies observed as of the summer of 2015. CALIFA currently lacks photometry matching the scale and diversity of its spectroscopy; this work is intended to meet all photometric needs for CALIFA galaxies while also identifying best photometric practices for upcoming integral field spectroscopy surveys such as SAMI and MaNGA. This catalog comprises gri surface brightness profiles derived from Sloan Digital Sky Survey (SDSS) imaging, a variety of non-parametric quantities extracted from these profiles, and parametric models fitted to the i-band profiles (1D) and original galaxy images (2D). To complement our photometric analysis, we contrast the relative performance of our 1D and 2D modelling approaches. The ability of each measurement to characterize the global properties of galaxies is quantitatively assessed, in the context of constructing the tightest scaling relations. Where possible, we compare our photometry with existing photometrically or spectroscopically obtained measurements from the literature. Close agreement is found with Walcher et al. (2014), the current source of basic photometry and classifications of CALIFA galaxies, while comparisons with spectroscopically derived quantities reveal the effect of CALIFA's limited field of view compared to broadband imaging surveys such as the SDSS. The colour-magnitude diagram, star formation main sequence, and Tully-Fisher relation of CALIFA galaxies are studied, to give a small example of the investigations possible with this rich catalog. We conclude with a discussion of points of concern for ongoing integral field spectroscopy surveys and directions for future expansion and exploitation of this work.
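The 1D parametric fits mentioned above are a routine step in this kind of photometric analysis. Purely as an illustrative sketch (the catalog's actual fitting choices are not specified in this abstract), the Python snippet below fits a single Sérsic profile to an i-band surface brightness profile with SciPy; the arrays radius_arcsec and mu_i are hypothetical placeholders standing in for a measured profile.

```python
import numpy as np
from scipy.optimize import curve_fit

def sersic_mu(r, mu_e, r_e, n):
    """Sersic profile in surface-brightness (mag/arcsec^2) form:
    mu(r) = mu_e + 2.5*b_n/ln(10) * ((r/r_e)**(1/n) - 1),
    using the common approximation b_n ~ 1.9992*n - 0.3271 (valid for n ~ 0.5-10)."""
    b_n = 1.9992 * n - 0.3271
    return mu_e + 2.5 * b_n / np.log(10) * ((r / r_e) ** (1.0 / n) - 1.0)

# Hypothetical measured i-band profile (radius in arcsec, mu in mag/arcsec^2).
radius_arcsec = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
mu_i = np.array([19.2, 19.9, 20.8, 21.9, 23.2, 24.6])

# Fit the effective surface brightness mu_e, effective radius r_e and Sersic index n.
popt, pcov = curve_fit(sersic_mu, radius_arcsec, mu_i, p0=[21.0, 8.0, 2.0])
mu_e, r_e, n = popt
print(f"mu_e = {mu_e:.2f} mag/arcsec^2, r_e = {r_e:.2f} arcsec, n = {n:.2f}")
```

A 2D approach would instead fit a model image convolved with the point spread function to the galaxy image itself, which is one reason the 1D and 2D results are contrasted in the catalog.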

Relevance:

100.00%

Abstract:

The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium-sized enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. Applying the methodology of content analysis, this paper analyzes the main factors and tools that make websites usable and intuitive and that promote better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites have, in general, appropriate levels of usability.

Relevance:

100.00%

Abstract:

Concrete solar collectors offer a type of solar collector with structural, aesthetic and economic advantages over current popular technologies. This study examines the influential parameters of concrete solar collectors. In addition to the external conditions, the performance of a concrete solar collector is influenced by the thermal properties of the concrete matrix, piping network and fluid. Geometric and fluid flow parameters also influence the performance of the concrete solar collector. A literature review of concrete solar collectors is conducted in order to define the benchmark parameters from which individual parameters are then compared. The numerical model consists of a 1D pipe flow network coupled with the heat transfer in a 3D concrete domain. This paper is concerned with the physical parameters that define the concrete solar collector, thus a constant surface temperature is used as the exposed surface boundary condition with all other surfaces being insulated. Results show that, of the parameters investigated, the pipe spacing, ps, concrete conductivity, kc, and the pipe embedment depth, demb, are among those parameters which have the greatest effect on the collector’s performance. The optimum balance between these parameters is presented with respect to the thermal performance and discussed with reference to practical development issues.
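As a schematic of the kind of coupled 1D–3D model described above (illustrative only; the paper's exact formulation, coefficients and source terms may differ), the heat transfer in the concrete and the energy balance along a pipe can be written as:

```latex
% Schematic governing equations for a concrete collector of this type
% (illustrative form only; not the paper's exact model).
\rho_c\, c_{p,c}\,\frac{\partial T}{\partial t}
  \;=\; \nabla\!\cdot\!\bigl(k_c\,\nabla T\bigr)
  \qquad \text{in the 3D concrete domain,}
\qquad
\dot{m}\, c_{p,f}\,\frac{\mathrm{d}T_f}{\mathrm{d}x}
  \;=\; h\,P\,\bigl(T_w(x) - T_f(x)\bigr)
  \qquad \text{along each 1D pipe.}
```

Here kc is the concrete conductivity, T_w(x) the local pipe-wall temperature, h the internal convective coefficient, P the pipe perimeter and ṁ the fluid mass flow rate; a fixed temperature on the exposed surface and insulated conditions on all other surfaces close the problem, matching the boundary conditions stated above.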

Relevance:

100.00%

Abstract:

Data mining can be defined as the extraction of implicit, previously unknown, and potentially useful information from data. Numerous researchers have been developing security technology and exploring new methods to detect cyber-attacks with the DARPA 1998 dataset for Intrusion Detection and the modified versions of this dataset, KDDCup99 and NSL-KDD, but until now no one has examined the performance of the Top 10 data mining algorithms selected by experts in data mining. The classification learning algorithms compared in this thesis are C4.5, CART, k-NN and Naïve Bayes. The performance of these algorithms is compared in terms of accuracy, error rate and average cost on modified versions of the NSL-KDD train and test datasets, where the instances are classified as normal or into four cyber-attack categories: DoS, Probing, R2L and U2R. Additionally, the most important features for detecting cyber-attacks, both across all categories and within each category, are evaluated with Weka’s Attribute Evaluator and ranked according to Information Gain. The results show that the classification algorithm with the best performance on the dataset is the k-NN algorithm. The most important features for detecting cyber-attacks are basic features such as the number of seconds of a network connection, the protocol used for the connection, the network service used, the normal or error status of the connection and the number of data bytes sent. The most important features for detecting DoS, Probing and R2L attacks are basic features and the least important are content features, unlike U2R attacks, where the content features are the most important for detection.
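The thesis ran these comparisons in Weka. Purely as an illustrative sketch of the same workflow in Python (k-NN classification plus an information-gain-style feature ranking), the snippet below uses scikit-learn on an NSL-KDD-style table; the file names, preprocessing choices and parameter values are assumptions, not the thesis's actual setup.

```python
import pandas as pd
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import accuracy_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import OrdinalEncoder

# Hypothetical CSV exports of the NSL-KDD train/test splits, with a 'label'
# column already mapped to {normal, DoS, Probing, R2L, U2R}.
train = pd.read_csv("nsl_kdd_train.csv")
test = pd.read_csv("nsl_kdd_test.csv")

X_train, y_train = train.drop(columns=["label"]), train["label"]
X_test, y_test = test.drop(columns=["label"]), test["label"]

# Encode the symbolic features (e.g. protocol_type, service, flag) as integers.
cat_cols = X_train.select_dtypes(include="object").columns
enc = OrdinalEncoder(handle_unknown="use_encoded_value", unknown_value=-1)
X_train[cat_cols] = enc.fit_transform(X_train[cat_cols])
X_test[cat_cols] = enc.transform(X_test[cat_cols])

# Rank features by mutual information, an information-gain-like criterion.
mi = mutual_info_classif(X_train, y_train, discrete_features="auto", random_state=0)
ranking = sorted(zip(X_train.columns, mi), key=lambda t: t[1], reverse=True)
print("Top 5 features:", ranking[:5])

# k-NN classifier, the best-performing algorithm reported in the thesis.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, knn.predict(X_test)))
```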

Relevance:

100.00%

Abstract:

Purpose – The objective of this exploratory study is to investigate the “flow-through” or relationship between top-line measures of hotel operating performance (occupancy, average daily rate and revenue per available room) and bottom-line measures of profitability (gross operating profit and net operating income), before and during the recent Great Recession. Design/methodology/approach – This study uses data provided by PKF Hospitality Research for the period from 2007 to 2009. A total of 714 hotels were analyzed and various top-line and bottom-line profitability changes were computed using both absolute levels and percentages. Multiple regression analysis was used to examine the relationship between top-line and bottom-line measures, and to derive flow-through ratios. Findings – The results show that average daily rate (ADR) and occupancy are significantly and positively related to gross operating profit per available room (GOPPAR) and net operating income per available room (NOIPAR). The evidence indicates that ADR, rather than occupancy, appears to be the stronger predictor and better measure of RevPAR growth and bottom-line profitability. The correlations and explained variances are also higher than those reported in prior research. Flow-through ratios range between 1.83 and 1.91 for NOIPAR, and between 1.55 and 1.65 for GOPPAR, across all chain-scales. Research limitations/implications – Limitations of this study include the limited number of years in the study period, the limited number of hotels in a competitive set, and the self-selection of hotels by the researchers. Practical implications – While ADR and occupancy work in combination to drive profitability, the authors' study shows that ADR is the stronger predictor of profitability. Hotel managers can use flow-through ratios to make financial forecasts, or use them as inputs in valuation models, to forecast future profitability. Originality/value – This paper extends prior research on the relationship between top-line measures and bottom-line profitability and serves to inform lodging owners, operators and asset managers about flow-through ratios and how these ratios impact hotel profitability.
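As an illustration of the flow-through idea only (the paper's exact regression specification is not given in this abstract), the sketch below computes per-hotel flow-through ratios under one common definition (change in profit per unit change in revenue) and a pooled least-squares slope; the numbers and variable names are hypothetical placeholders.

```python
import numpy as np

# Hypothetical year-over-year changes for a set of hotels (dollars per available room).
delta_revpar = np.array([-12.0, -8.5, -20.3, -5.1, -15.7])   # change in RevPAR
delta_noipar = np.array([-22.5, -16.0, -38.9, -9.4, -30.1])  # change in NOI per available room

# One common flow-through definition: change in profit per unit change in revenue.
flow_through = delta_noipar / delta_revpar
print("Per-hotel flow-through:", np.round(flow_through, 2))

# A pooled estimate via a simple least-squares slope (intercept included),
# analogous in spirit to the regression-derived ratios reported in the study.
slope, intercept = np.polyfit(delta_revpar, delta_noipar, 1)
print(f"Pooled flow-through slope: {slope:.2f}")
```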

Relevance:

100.00%

Abstract:

This thesis analyzes several search methods for 3D data. It gives a general overview of the field of Computer Vision, of the state of the art of acquisition sensors, and of some of the formats used to describe 3D data. It then examines 3D Object Recognition in more depth: in addition to describing the entire Local Feature matching process, it focuses on the detection phase for salient points. In particular, a Learned Keypoint detector, based on machine learning techniques, is analyzed. This detector is illustrated through the implementation of two neighbour search algorithms: an exhaustive one (K-d tree) and an approximate one (Radial Search). Finally, some experimental evaluations of the efficiency and speed of the detector implemented with the different search methods are reported, showing an effective performance improvement without a considerable loss of accuracy when the approximate search is used.
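For readers unfamiliar with the two neighbour-search strategies mentioned, the sketch below contrasts an exhaustive nearest-neighbour query with a fixed-radius query on a synthetic 3D point cloud using SciPy's k-d tree; it illustrates the general technique, not the thesis's implementation, and the point data are randomly generated for demonstration only.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
cloud = rng.random((10_000, 3))   # synthetic 3D point cloud
queries = rng.random((5, 3))      # candidate keypoints to match

tree = cKDTree(cloud)

# Exhaustive-style search: the single nearest neighbour of each query point.
dist, idx = tree.query(queries, k=1)

# Radius search: all neighbours within a fixed radius, a cheaper alternative
# when only "close enough" matches are needed.
neighbours = tree.query_ball_point(queries, r=0.05)

for q, d, i, nb in zip(queries, dist, idx, neighbours):
    print(f"query {np.round(q, 2)}: nearest idx={i} at d={d:.3f}, "
          f"{len(nb)} points within r=0.05")
```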

Relevance:

100.00%

Abstract:

Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

Relevance:

100.00%

Abstract:

Valveless pulsejets are extremely simple aircraft engines; essentially cleverly designed tubes with no moving parts. These engines utilize pressure waves, instead of machinery, for thrust generation, and have demonstrated thrust-to-weight ratios over 8 and thrust specific fuel consumption levels below 1 lbm/lbf-hr – performance levels that can rival many gas turbines. Despite their simplicity and competitive performance, they have not seen widespread application due to extremely high noise and vibration levels, which have persisted as an unresolved challenge primarily due to a lack of fundamental insight into the operation of these engines. This thesis develops two theories for pulsejet operation (both based on electro-acoustic analogies) that predict measurements better than any previous theory reported in the literature, and then uses them to devise and experimentally validate effective noise reduction strategies. The first theory analyzes valveless pulsejets as acoustic ducts with axially varying area and temperature. An electro-acoustic analogy is used to calculate longitudinal mode frequencies and shapes for prescribed area and temperature distributions inside an engine. Predicted operating frequencies match experimental values to within 6% with the use of appropriate end corrections. Mode shapes are predicted and used to develop strategies for suppressing higher modes that are responsible for much of the perceived noise. These strategies are verified experimentally and via comparison to existing models/data for valveless pulsejets in the literature. The second theory analyzes valveless pulsejets as acoustic systems/circuits in which each engine component is represented by an acoustic impedance. These are assembled to form an equivalent circuit for the engine that is solved to find the frequency response. The theory is used to predict the behavior of two interacting pulsejet engines. It is validated via comparison to experiment and data in the literature. The technique is then used to develop and experimentally verify a method for operating two engines in anti-phase without interfering with thrust production. Finally, Helmholtz resonators are used to suppress higher order modes that inhibit noise suppression via anti-phasing. Experiments show that the acoustic output of two resonator-equipped pulsejets operating in anti-phase is 9 dBA less than the acoustic output of a single pulsejet.
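Two textbook acoustics relations underpin analyses of this kind; they are shown here only as standard background, not as the thesis's axially-varying duct formulation: the longitudinal mode frequencies of a uniform duct closed at one end and open at the other, and the resonance frequency of a lumped Helmholtz resonator.

```latex
% Standard background relations (uniform closed-open duct; lumped Helmholtz
% resonator). The thesis generalizes the duct model to axially varying area
% and temperature.
f_n \;=\; \frac{(2n-1)\,c}{4L}, \qquad n = 1, 2, 3, \dots
\qquad\qquad
f_H \;=\; \frac{c}{2\pi}\sqrt{\frac{S}{V\,L_{\mathrm{eff}}}}
```

Here c is the speed of sound, L the duct length, and S, L_eff and V the neck cross-sectional area, effective neck length and cavity volume of the resonator; resonators tuned near a higher duct mode can damp that mode, which is the role they play in the anti-phasing experiments described above.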

Relevance:

100.00%

Abstract:

Principal attrition is a national problem, particularly in large urban school districts. Research confirms that schools that serve high proportions of children living in poverty have the most difficulty attracting and retaining competent school leaders. Principals at the helm of high-poverty schools turn over faster than the national average of three to four years and experience higher rates of teacher attrition. This leadership turnover has a fiscal impact on districts and negatively affects student achievement. Research identifies a myriad of reasons why administrators leave the role of principal: some leave the position for retirement; some exit based on the difficulty of the role and a lack of support; and some simply leave for other opportunities within and outside the profession altogether. As expectations for both teacher and learner performance drive the national education agenda, understanding how to keep effective principals in their jobs is critical. This study examined the factors that principals in a large urban district identified as potentially affecting their decisions to stay in the position. The study utilized a multi-dimensional, web-based questionnaire to examine principals’ perceptions regarding contributing factors that impact tenure. Results indicated that:

• having a quality teaching staff and establishing a positive work-life balance were important stay factors for principals;
• having an effective supervisor and collegial support from other principals were helpful supports; and
• having adequate resources, time for long-term planning, and teacher support and resources were critical working conditions.

Taken together, these indicators were the most frequently cited factors that would keep principals in their positions. The results were used to create a framework that may serve as a potential guide for addressing principal retention.

Relevance:

100.00%

Abstract:

Chemical speciation in foodstuffs is of utmost importance since it is nowadays recognized that both the toxicity and the bioavailability of an element depend on the chemical form in which the element is present. Regarding arsenic, inorganic species are classified as carcinogenic, while organic arsenic, such as arsenobetaine (AsB) or arsenocholine (AsC), is considered less toxic or even non-toxic. Coupling a High Performance Liquid Chromatograph (HPLC) with an Inductively Coupled Plasma Mass Spectrometer (ICP-MS) combines the separation power of the former with the selectivity and sensitivity of the latter. The present work aims at developing a method, using the HPLC-ICP-MS technique, to identify and quantify the chemical species of arsenic present in two food matrices, rice and fish. Two extraction methods, ultrasound and microwave, and different settings were studied. The best method was chosen based on recovery percentages. To ensure that no interconversion of species was occurring, individual spikes of each species of arsenic were made in both matrices and recovery rates were calculated. To guarantee accurate results, the reference material BCR-627 TUNA FISH, containing certified values for AsB and DMA, was analyzed. Chromatographic separation was achieved using an anion exchange column, HAMILTON-PRP X-100, which allowed the separation of the four arsenic species for which standards were available (AsB, dimethylarsenic (DMA), arsenite (AsIII) and arsenate (AsV)). The mobile phase was chosen based on the scientific literature and adjusted to laboratory conditions. Different gradients were studied. As a result, we verified that the arsenic species present in the two matrices were not the same: while in fish 90% of the arsenic present was in the form of arsenobetaine, in rice 80% of the arsenic was present as DMA and 20% as inorganic arsenic.
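For reference, the spike recoveries referred to above are conventionally computed with the standard definition below; this is the usual formula for spike-recovery checks, not one quoted from the paper.

```latex
% Conventional spike-recovery definition used to validate speciation methods.
\mathrm{Recovery}\,(\%) \;=\;
  \frac{C_{\mathrm{spiked}} - C_{\mathrm{unspiked}}}{C_{\mathrm{added}}} \times 100
```

Here C_spiked and C_unspiked are the concentrations measured in the sample with and without the spike, and C_added is the known concentration of the added arsenic species; values close to 100% indicate that no loss or interconversion of species occurred during extraction.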

Relevance:

100.00%

Abstract:

We develop the a-posteriori error analysis of hp-version interior-penalty discontinuous Galerkin finite element methods for a class of second-order quasilinear elliptic partial differential equations. Computable upper and lower bounds on the error are derived in terms of a natural (mesh-dependent) energy norm. The bounds are explicit in the local mesh size and the local degree of the approximating polynomial. The performance of the proposed estimators within an automatic hp-adaptive refinement procedure is studied through numerical experiments.
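To fix ideas, residual-based hp a posteriori bounds of the kind described are typically of the schematic form below; this is illustrative only, and the constants, scalings and exact residual terms depend on the paper's analysis.

```latex
% Schematic form of a residual-based hp error estimate in an energy norm;
% R_K is the interior residual on element K and J_{\partial K} collects the
% inter-element (and boundary) jump terms.
\|u - u_h\|_{E}^{2} \;\lesssim\; \sum_{K \in \mathcal{T}_h} \eta_K^{2},
\qquad
\eta_K^{2} \;=\; \frac{h_K^{2}}{p_K^{2}}\,\bigl\|R_K\bigr\|_{L^2(K)}^{2}
            \;+\; \frac{h_K}{p_K}\,\bigl\|J_{\partial K}\bigr\|_{L^2(\partial K)}^{2}
```

The explicit dependence on the local mesh size h_K and local polynomial degree p_K is what allows the estimator to drive an automatic hp-adaptive refinement procedure of the kind studied in the numerical experiments.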

Relevance:

100.00%

Abstract:

Living organisms are open dissipative thermodynamic systems that rely on mechano-thermo-electrochemical interactions to survive. Plant physiological processes allow plants to survive by converting solar radiation into chemical energy and storing that energy in a form that can be used. Mammals catabolize food to obtain energy that is used to fuel, build and repair their cellular components. The exergy balance is a combined statement of the first and second laws of thermodynamics and provides insight into the performance of systems. In this paper, exergy balance equations for both mammals and green plants are presented and analyzed.
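A generic open-system exergy rate balance of the kind referred to above reads as follows; this is a standard textbook form, not necessarily the exact equations derived in the paper for plants and mammals.

```latex
% Exergy rate balance for an open (control-volume) system with dead state (T_0, p_0);
% e_f is the flow exergy per unit mass and \dot{E}x_d = T_0\,\dot{\sigma} \ge 0 is the
% exergy destruction due to irreversibility.
\frac{\mathrm{d}Ex_{\mathrm{cv}}}{\mathrm{d}t}
  \;=\; \sum_j \Bigl(1 - \frac{T_0}{T_j}\Bigr)\dot{Q}_j
  \;-\; \Bigl(\dot{W}_{\mathrm{cv}} - p_0\,\frac{\mathrm{d}V_{\mathrm{cv}}}{\mathrm{d}t}\Bigr)
  \;+\; \sum_{\mathrm{in}} \dot{m}\,e_f
  \;-\; \sum_{\mathrm{out}} \dot{m}\,e_f
  \;-\; \dot{E}x_{\mathrm{d}}
```

The heat, work and mass-flow terms capture the exchanges with the environment, while the destruction term quantifies the irreversibility of the internal processes, which is what makes the exergy balance a useful performance measure for dissipative systems such as organisms.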

Relevance:

100.00%

Abstract:

Evaluation of the quality of the environment is essential for human wellness, as pollutants in trace amounts can cause serious health problems. Nitrosamines are a group of compounds that are considered potential carcinogens and can be found in drinking water (as disinfection byproducts), foods, beverages and cosmetics. To monitor the levels of these compounds and minimize daily intake, fast and reliable analytical techniques are required. As these compounds are highly polar, their extraction and enrichment from aqueous environmental samples are challenging. In addition, the trend in analytical techniques toward reduced sample size and minimal use of organic solvents demands new methods of analysis. To fulfil these requirements, a new method of online preconcentration tailored to electrokinetic chromatography is introduced. In this method, electroosmotic flow (EOF) was suppressed to increase the interaction time between the analyte and the micellar phase, so the only force mobilizing the neutral analytes is their interaction with the moving micelles. In the absence of EOF, the polarity of the applied potential was switched (negative or positive) to force the (anionic or cationic) micelles to move toward the detector. To avoid the excessive band broadening caused by the longer analysis times of slow-moving micelles, auxiliary pressure was introduced to boost the micelle movement toward the detector using an in-house designed and built apparatus. Applying the external auxiliary pressure significantly reduced the analysis times without compromising separation efficiency. Parameters such as the type of surfactant, the composition of the background electrolyte (BGE), the type of capillary, matrix effects and organic modifiers were evaluated in optimizing the method. The enrichment factors for the targeted analytes were impressive; in particular, cationic surfactants were shown to be suitable for the analysis of nitrosamines due to their ability to act as hydrogen bond donors. Ammonium perfluorooctanoate (APFO) also showed remarkable results in terms of peak shape and number of theoretical plates. It was shown that the separation results were best when a high-conductivity sample was paired with a BGE of lower conductivity. Using higher surfactant concentrations (up to 200 mM SDS) than is usual for micellar electrokinetic chromatography (50 mM SDS) improved the sweeping. A new method for micro-extraction and enrichment of highly polar neutral analytes (N-nitrosamines in particular), based on three-phase drop micro-extraction, was also introduced and its performance studied. For this method, a new device was fabricated from easy-to-find components, and its operation and application were demonstrated. Compared to conventional extraction methods (liquid-liquid extraction), the consumption of organic solvents and the operation times were significantly lower.
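For context, the enrichment factor reported for online preconcentration methods of this kind is commonly expressed as a ratio of sensitivities; this is a conventional definition, and the paper's exact calculation is not specified in the abstract.

```latex
% Conventional enrichment-factor definition: the ratio of calibration sensitivities
% (slopes) obtained with and without the online preconcentration step.
EF \;=\; \frac{S_{\mathrm{with\ preconcentration}}}{S_{\mathrm{conventional\ injection}}}
```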

Relevance:

100.00%

Abstract:

Low temperature is one of the main environmental constraints on rice (Oryza sativa L.) grain yield. Multi-environment studies play a critical role in the sustainability of rice production across diverse environments; however, few multi-environment studies of rice in temperate climates have been carried out. The aim was to study the performance of rice plants in cold environments. Four experimental lines and six cultivars were evaluated at three locations during three seasons. The grain yield data were analyzed with ANOVA, mixed models based on best linear unbiased predictors (BLUPs), and a genotype plus genotype × environment interaction (GGE) biplot. A high genotype contribution (> 25%) to grain yield was observed, and the interaction between genotype and location was not very important. Results also showed that ‘Quila 241319’ was the best experimental line, with the highest grain yield (11.3 t ha-1) and stable grain yield across the environments; the commercial cultivars were classified as medium grain yield genotypes.
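As a sketch of the BLUP-based step only (the authors' exact model, software and data are not reproduced here), the snippet below fits a simple mixed model with fixed environment effects and random genotype intercepts and extracts the genotype BLUPs; the data frame, column names and values are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format trial data: one row per plot, with genotype,
# environment (location-season combination) and grain yield in t/ha.
data = pd.DataFrame({
    "genotype":    ["G1", "G1", "G1", "G2", "G2", "G2", "G3", "G3", "G3"],
    "environment": ["E1", "E2", "E3", "E1", "E2", "E3", "E1", "E2", "E3"],
    "grain_yield": [9.8, 10.4, 10.1, 8.9, 9.1, 9.0, 11.0, 11.3, 11.2],
})

# Mixed model: fixed environment effects, random genotype intercepts.
model = smf.mixedlm("grain_yield ~ C(environment)", data, groups=data["genotype"])
fit = model.fit()
print(fit.summary())

# Genotype BLUPs (predicted random intercepts), used to rank lines across environments.
for genotype, effect in fit.random_effects.items():
    print(genotype, float(effect.iloc[0]))
```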