962 results for theory of constraints


Relevance: 30.00%

Abstract:

Due to external constraints (imposed by the market and the legal system) and internal changes, nonprofit organizations have been converting into for-profit entities that combine commercial revenue with social value creation. To build an understanding of the conversion process, its challenges, the reasons behind it, the decision-making process and the key success factors of a conversion are examined. A two-step research procedure is used, combining literature research with a multiple case study approach based on expert interviews with established companies. The outcome is a practical guideline (including a decision matrix) for social entrepreneurs who may face a conversion.

Relevance: 30.00%

Abstract:

We use a new data set to study the determinants of the performance of open-end actively managed equity mutual funds in 27 countries. We find that mutual funds underperform the market overall. The results show important differences in the determinants of fund performance in the USA and elsewhere in the world. The US evidence of diminishing returns to scale is not a universal truth, as the performance of funds located outside the USA and of funds that invest overseas is not negatively affected by scale. Our findings suggest that the adverse scale effects in the USA are related to liquidity constraints faced by funds that, by virtue of their style, have to invest in small and domestic stocks. Country characteristics also explain fund performance: funds located in countries with liquid stock markets and strong legal institutions display better performance.
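
A minimal sketch of how the scale-effects question above could be examined, assuming a hypothetical fund-year panel: an interaction term lets the coefficient on fund size differ for US-domiciled funds. The file and column names (alpha, log_tna, liquidity, legal_quality, usa, fund_id) are illustrative assumptions, not the paper's variables.

```python
# Hypothetical sketch: regression of fund performance on fund size and country
# characteristics, in the spirit of the scale-effects test described above.
import pandas as pd
import statsmodels.formula.api as smf

funds = pd.read_csv("funds_panel.csv")  # assumed fund-year panel

# The log_tna:usa interaction allows diminishing returns to scale to be
# US-specific, as the abstract suggests.
model = smf.ols(
    "alpha ~ log_tna + log_tna:usa + liquidity + legal_quality + usa",
    data=funds,
).fit(cov_type="cluster", cov_kwds={"groups": funds["fund_id"]})
print(model.summary())
```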

Relevance: 30.00%

Abstract:

Many municipal activities require updated large-scale maps that include both topographic and thematic information. For this purpose, the efficient use of very high spatial resolution (VHR) satellite imagery suggests the development of approaches that enable a timely discrimination, counting and delineation of urban elements according to legal technical specifications and quality standards. The nature of this data source and the expanding range of applications therefore call for objective methods and quantitative metrics to assess the quality of the extracted information that go beyond traditional thematic accuracy alone. The present work concerns the development and testing of a new approach for using technical mapping standards in the quality assessment of buildings automatically extracted from VHR satellite imagery. Feature extraction software was employed to map buildings present in a pansharpened QuickBird image of Lisbon. Quality assessment was exhaustive and involved comparisons of extracted features against a reference data set, introducing cartographic constraints from scales 1:1000, 1:5000 and 1:10,000. The spatial data quality elements subject to evaluation were thematic (attribute) accuracy, completeness, and geometric quality assessed through planimetric deviation from the reference map. Tests were developed and metrics analyzed considering thresholds and standards for the large mapping scales most frequently used by municipalities. Results show that completeness values varied with mapping scale and were only slightly higher for scale 1:10,000. Concerning geometric quality, a large percentage of extracted features met the strict topographic standards of planimetric deviation for scale 1:10,000, while no buildings were compliant with the specification for scale 1:1000.
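
As an illustration of the kind of quality metrics the approach evaluates, the sketch below computes completeness and the share of buildings within a scale-dependent planimetric tolerance. The counts, deviations and tolerance values are placeholder assumptions, not the study's technical specifications.

```python
# Minimal sketch (assumed data): completeness and planimetric-deviation checks
# for extracted buildings against a reference map.
matched = 812           # extracted buildings matched to a reference building (assumed)
reference_total = 1000  # buildings in the reference data set (assumed)

completeness = matched / reference_total
print(f"completeness: {completeness:.1%}")

# Planimetric tolerance typically grows with the map scale denominator;
# the values below are illustrative, not the actual standards.
tolerances_m = {"1:1000": 0.3, "1:5000": 1.5, "1:10000": 3.0}

deviations_m = [0.4, 1.2, 2.8, 0.9]  # per-building planimetric deviations (assumed)
for scale, tol in tolerances_m.items():
    share = sum(d <= tol for d in deviations_m) / len(deviations_m)
    print(f"{scale}: {share:.0%} of buildings within {tol} m")
```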

Relevance: 30.00%

Abstract:

The cerebellum floccular complex lobes (FCLs) are housed in the FCL fossa of the periotic complex. There is experimental evidence indicating that the FCLs integrate visual and vestibular information and are responsible for the vestibulo-ocular reflex, the vestibulo-collic reflex, smooth pursuit and gaze holding. Thus, the behavior of extinct animals has been correlated with FCL dimensions in multiple paleoneuroanatomy studies. Here I analyzed braincase endocasts of a representative sample of Mammalia (48 species) and Aves (59 species), rendered using tomography and image segmentation, and tested statistical correlations between floccular complex volume and ecological and behavioral traits to assess various previously formulated paleobiological speculations. My results demonstrate that: 1) there is no significant correlation between relative FCL volume and body mass; 2) there is no significant correlation between relative FCL and optic lobe size in birds; 3) average relative FCL size is larger in diurnal than in nocturnal birds, but there is no statistically significant difference in mammals; 4) feeding strategies are related to different FCL size patterns in birds, but not in mammals; 5) locomotion type is not related to relative FCL size in mammals; 6) agility is not significantly correlated with FCL size in mammals. I conclude that, despite the apparent relation between FCL size and ecology in birds, the cerebellum of tetrapods is a highly plastic structure and may be adapted to control different functions across different taxonomic levels. For example, the European mole (Talpa europaea), which is fossorial and practically blind, has a relatively larger FCL fossa than bats, which are highly maneuverable. Therefore, variation in FCL size may be better explained by a combination of multiple factors related to anatomical and phylogenetic evolutionary constraints.
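
A toy sketch of two tests of the kind listed above, on made-up numbers: a rank correlation between relative FCL volume and (log) body mass, and a diurnal-versus-nocturnal comparison of relative FCL size. The choice of Spearman and Mann-Whitney tests, and all values, are assumptions for illustration, not the study's data or methods.

```python
# Illustrative correlation and group-comparison tests on hypothetical values.
import numpy as np
from scipy import stats

rel_fcl = np.array([0.021, 0.034, 0.028, 0.040, 0.025, 0.031])  # FCL vol / endocast vol (assumed)
body_mass = np.array([12.0, 0.3, 450.0, 1.1, 85.0, 5.6])        # kg (assumed)

r, p = stats.spearmanr(rel_fcl, np.log10(body_mass))
print(f"Spearman r = {r:.2f}, p = {p:.3f}")

diurnal = np.array([0.034, 0.040, 0.031])    # assumed relative FCL sizes
nocturnal = np.array([0.021, 0.028, 0.025])
u, p = stats.mannwhitneyu(diurnal, nocturnal, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```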

Relevance: 30.00%

Abstract:

This work studies fuel retail firms’ strategic behavior in a two-dimensional product differentiation framework. Following the mandatory provision of “low-cost” fuel, we consider that capacity constraints force firms to eliminate one of the previously offered qualities. Firms play a two-stage game, first choosing fuel qualities from three possibilities (low-cost, medium-quality and high-quality fuel) and then prices, given exogenous opposite locations. At the highest level of consumer heterogeneity, a subgame perfect Nash equilibrium exists in which both firms choose minimum quality differentiation. Consumers are worse off if no differentiation occurs in the medium and high qualities. The effect of the mandatory “low-cost” fuel law on prices is ambiguous.
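
A hedged sketch of how the first stage of such a two-stage game can be analyzed once the second-stage price equilibrium is folded into a payoff matrix: each firm picks a quality, and pure-strategy Nash equilibria are found by checking for profitable unilateral deviations. The payoff numbers are placeholders, not the paper's profit functions; only the equilibrium search is illustrated.

```python
# Reduced first-stage quality game with placeholder second-stage profits.
QUALITIES = ["low-cost", "medium", "high"]

# payoff[q1][q2] = (profit of firm 1, profit of firm 2) under an assumed price equilibrium
payoff = {
    "low-cost": {"low-cost": (1.0, 1.0), "medium": (1.4, 1.6), "high": (1.5, 1.8)},
    "medium":   {"low-cost": (1.6, 1.4), "medium": (1.2, 1.2), "high": (1.7, 1.9)},
    "high":     {"low-cost": (1.8, 1.5), "medium": (1.9, 1.7), "high": (1.3, 1.3)},
}

def is_nash(q1, q2):
    # No profitable unilateral deviation for either firm.
    best1 = all(payoff[q1][q2][0] >= payoff[d][q2][0] for d in QUALITIES)
    best2 = all(payoff[q1][q2][1] >= payoff[q1][d][1] for d in QUALITIES)
    return best1 and best2

equilibria = [(q1, q2) for q1 in QUALITIES for q2 in QUALITIES if is_nash(q1, q2)]
print("pure-strategy equilibria of the reduced game:", equilibria)
```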

Relevance: 30.00%

Abstract:

We investigate the impact of cross-delisting on firms’ financial constraints and investment sensitivities. We find that firms that cross-delisted from a U.S. stock exchange face stronger post-delisting financial constraints than their cross-listed counterparts, as measured by investment-to-cash-flow sensitivity. Following a delisting, the sensitivity of investment to cash flow increases significantly and firms also tend to save more cash out of cash flows. Moreover, this increase appears to be primarily driven by informational frictions that constrain access to external financing. We document that information asymmetry problems are stronger for firms from countries with weaker shareholder protection and for firms from less developed capital markets.
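
A minimal sketch of an investment-to-cash-flow sensitivity regression with a cross-delisting interaction, under assumed data: a positive coefficient on the interaction term would indicate that investment becomes more sensitive to internal cash flow after the delisting. The file and column names (inv_k, cf_k, q, delisted, firm_id) are hypothetical, not the paper's variables.

```python
# Hypothetical sketch of the investment sensitivity test described above.
import pandas as pd
import statsmodels.formula.api as smf

firms = pd.read_csv("firm_year_panel.csv")  # assumed firm-year panel

model = smf.ols("inv_k ~ cf_k + cf_k:delisted + delisted + q", data=firms).fit(
    cov_type="cluster", cov_kwds={"groups": firms["firm_id"]}
)
print(model.summary())
```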

Relevance: 30.00%

Abstract:

The kinetics of GnP dispersion in polypropylene melt was studied using a prototype small-scale modular extensional mixer. Its modular nature enabled the sequential application of a mixing step, melt relaxation, and a second mixing step. The latter could reproduce the flow conditions of the first mixing step or generate milder flow conditions. The effect of these sequences of flow constraints upon GnP dispersion along the mixer length was studied for composites with 2 and 10 wt.% GnP. The samples collected along the first mixing zone showed a gradual decrease in the number and size of GnP agglomerates, at a rate that was independent of the flow conditions imposed on the melt but dependent on composition. The relaxation zone induced GnP re-agglomeration, and the application of a second mixing step caused variable dispersion results that were largely dependent on the hydrodynamic stresses generated.
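
One simple way to quantify dispersion kinetics of this kind is to fit a decay curve to the agglomerate area ratio measured along the mixing zone. The sketch below fits an exponential decay on placeholder values; the positions and ratios are assumptions, not the study's measurements.

```python
# Illustrative exponential-decay fit of agglomerate area ratio vs. mixer position.
import numpy as np
from scipy.optimize import curve_fit

position_mm = np.array([0, 20, 40, 60, 80, 100])              # sampling ports (assumed)
area_ratio = np.array([0.32, 0.21, 0.14, 0.10, 0.07, 0.05])   # agglomerate area fraction (assumed)

def decay(x, a0, k, a_inf):
    return a_inf + (a0 - a_inf) * np.exp(-k * x)

(a0, k, a_inf), _ = curve_fit(decay, position_mm, area_ratio, p0=(0.3, 0.02, 0.02))
print(f"initial ratio {a0:.2f}, rate constant {k:.3f} /mm, plateau {a_inf:.2f}")
```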

Relevance: 30.00%

Abstract:

Nowadays, many P2P applications proliferate in the Internet. The attractiveness of many of these systems relies on the collaborative approach used to exchange large resources without the dependence and associated constraints of centralized approaches, where a single server is responsible for handling all the requests from clients. As a consequence, some P2P systems are also interesting and cost-effective approaches to be adopted by content providers and other Internet players. However, there are several coexistence problems between P2P applications and Internet Service Providers (ISPs) due to the unforeseeable behavior of P2P traffic aggregates in ISP infrastructures. In this context, this work proposes a collaborative P2P/ISP system able to underpin the development of novel Traffic Engineering (TE) mechanisms, contributing to a better coexistence between P2P applications and ISPs. Using the devised system, two TE methods are described that are able to estimate and control the impact of P2P traffic aggregates on ISP network links. One of the TE methods allows ISP administrators to foresee the expected impact that a given P2P swarm will have on the underlying network infrastructure. The other TE method enables the definition of ISP-friendly P2P topologies, where specific network links are protected from P2P traffic. As a result, the proposed system and associated mechanisms will contribute to improved ISP resource management tasks and foster the deployment of innovative ISP-friendly systems.
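
A minimal sketch, under assumptions, of how the expected impact of a swarm on network links could be estimated: overlay connections between peers are mapped onto shortest paths in a toy ISP topology and per-link load is accumulated. The topology, peer placement and per-connection rate are hypothetical and do not describe the proposed system's actual estimation method.

```python
# Toy estimation of per-link load induced by a P2P swarm overlay.
import networkx as nx
from collections import defaultdict

isp = nx.Graph()
isp.add_edges_from([("r1", "r2"), ("r2", "r3"), ("r1", "r3"), ("r3", "r4")])  # assumed topology

peer_router = {"p1": "r1", "p2": "r2", "p3": "r4", "p4": "r4"}  # assumed peer attachment
overlay = [("p1", "p3"), ("p2", "p3"), ("p1", "p4")]            # assumed swarm connections
rate_mbps = 5.0                                                 # assumed per-connection rate

link_load = defaultdict(float)
for a, b in overlay:
    path = nx.shortest_path(isp, peer_router[a], peer_router[b])
    for u, v in zip(path, path[1:]):
        link_load[tuple(sorted((u, v)))] += rate_mbps

for link, load in sorted(link_load.items()):
    print(link, f"{load:.1f} Mbps")
```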

Relevance: 30.00%

Abstract:

A search is performed for Higgs bosons produced in association with top quarks using the diphoton decay mode of the Higgs boson. Selection requirements are optimized separately for leptonic and fully hadronic final states from the top quark decays. The dataset used corresponds to an integrated luminosity of 4.5 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 7 TeV and 20.3 fb⁻¹ at 8 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. No significant excess over the background prediction is observed and upper limits are set on the tt̄H production cross section. The observed exclusion upper limit at 95% confidence level is 6.7 times the predicted Standard Model cross section value. In addition, limits are set on the strength of the Yukawa coupling between the top quark and the Higgs boson, taking into account the dependence of the tt̄H and tH cross sections as well as the H→γγ branching fraction on the Yukawa coupling. Lower and upper limits at 95% confidence level are set at −1.3 and +8.0 times the Yukawa coupling strength in the Standard Model.

Relevance: 30.00%

Abstract:

[Extract] The answer to the social and economic challenges that literacy (or its lack) is assumed to pose to developed countries deeply concerns the public policies of governments, namely those of the OECD area. In recent decades, these concerns have given rise to several diverse monitoring devices, initiatives and programmes for the development of reading (mainly), placing a strong emphasis on education. UNESCO (2006, p. 6), for instance, assumes that the literacy challenge can only be met by raising the quality of primary and secondary education and by intensifying programmes explicitly oriented towards youth and adult literacy. (...)

Relevance: 30.00%

Abstract:

A search for pair production of vector-like quarks, both up-type (T) and down-type (B), as well as for four-top-quark production, is presented. The search is based on pp collisions at √s = 8 TeV recorded in 2012 with the ATLAS detector at the CERN Large Hadron Collider and corresponding to an integrated luminosity of 20.3 fb⁻¹. Data are analysed in the lepton-plus-jets final state, characterised by an isolated electron or muon with high transverse momentum, large missing transverse momentum and multiple jets. Dedicated analyses are performed targeting three cases: a T quark with significant branching ratio to a W boson and a b-quark (TT̄→Wb+X), and both a T quark and a B quark with significant branching ratio to a Higgs boson and a third-generation quark (TT̄→Ht+X and BB̄→Hb+X respectively). No significant excess of events above the Standard Model expectation is observed, and 95% CL lower limits are derived on the masses of the vector-like T and B quarks under several branching ratio hypotheses assuming contributions from T→Wb, Zt, Ht and B→Wt, Zb, Hb decays. The 95% CL observed lower limits on the T quark mass range between 715 GeV and 950 GeV for all possible values of the branching ratios into the three decay modes, and are the most stringent constraints to date. Additionally, the most restrictive upper bounds on four-top-quark production are set in a number of new physics scenarios.

Relevance: 30.00%

Abstract:

Master's dissertation in Industrial Engineering

Relevance: 30.00%

Abstract:

The morphological evolution of the city of Braga has been the subject of several studies focusing on different urban areas in different periods. Using the accumulated knowledge provided by the available archaeological, historical and iconographic data of Braga, from Roman times to the nineteenth century, we intend to present a working methodology for the 3D representation of urban areas and their evolution, using ESRI's CityEngine tool. Different types of graphic and cartographic data will be integrated into an archaeological information system for the characterization of urban buildings. By linking this information system to the rules of characterization of urban spaces through the CityEngine tool, we can create the 3D urban spaces and their changes. The building characterization rules include several parameters of architectural elements that can be dynamically changed according to the latest information. This methodology will be applied to the best-known areas within the city, allowing the creation of different and dynamic layouts. Considerations about the concepts, challenges and constraints of using the CityEngine tool for recording and representing urban evolution knowledge will be discussed.
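
A minimal sketch of the idea behind parameter-driven building characterization, assuming made-up periods and attributes: a rule table maps a building's period to procedural attributes from which a simple extruded volume is derived. This stands in for, and is not, the project's actual CityEngine (CGA) rules.

```python
# Placeholder "characterization rules": period -> procedural building attributes.
RULES = {
    "roman":      {"floors": 1, "floor_height_m": 3.5, "roof": "tile_gable"},
    "medieval":   {"floors": 2, "floor_height_m": 2.8, "roof": "tile_gable"},
    "eighteenth": {"floors": 3, "floor_height_m": 3.2, "roof": "tile_hip"},
}

def building_volume(footprint_area_m2: float, period: str) -> float:
    """Derive a simple extruded volume for a footprint from the period's rule."""
    rule = RULES[period]
    height = rule["floors"] * rule["floor_height_m"]
    return footprint_area_m2 * height

# The same footprint re-characterized as knowledge about its period changes.
for period in RULES:
    print(period, f"{building_volume(120.0, period):.0f} m3")
```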

Relevance: 30.00%

Abstract:

Currently, the quality of the Indonesian national road network is inadequate due to several constraints, including overcapacity and overloaded trucks. The high deterioration rate of the road infrastructure in developing countries, along with major budgetary restrictions and high growth in traffic, has led to an emerging need for improving the performance of the highway maintenance system. However, the high number of intervening factors and their complex effects require advanced tools to successfully solve this problem. The high learning capabilities of Data Mining (DM) are a powerful solution to this problem. In the past, these tools have been successfully applied to solve complex and multi-dimensional problems in various scientific fields. Therefore, it is expected that DM can be used to analyze the large amount of data regarding pavement and traffic, identify the relationships between variables, and provide predictions based on the data. In this paper, we present a new approach to predict the International Roughness Index (IRI) of pavement based on DM techniques. DM was used to analyze the initial IRI data, including age, Equivalent Single Axle Load (ESAL), cracks, potholes, rutting, and long cracks. This model was developed and verified using data from the Integrated Indonesia Road Management System (IIRMS) that was measured with the National Association of Australian State Road Authorities (NAASRA) roughness meter. The results of the proposed approach are compared with the IIRMS analytical model adapted to the IRI, and the advantages of the new approach are highlighted. We show that the novel data-driven model is able to learn, with high accuracy, the complex relationships between the IRI and the contributing factors of overloaded trucks.
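
A hedged sketch of a data-driven IRI model of the kind described above, assuming a hypothetical export of IIRMS records: a regression model is trained on pavement condition and traffic variables and evaluated on held-out sections. The file name, column names and the choice of a random forest are illustrative assumptions, not the paper's model.

```python
# Hypothetical IRI prediction from pavement and traffic variables.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

data = pd.read_csv("iirms_sections.csv")  # assumed export of road-section records
features = ["initial_iri", "age_years", "esal", "crack_pct", "potholes", "rutting_mm", "long_crack_m"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["iri"], test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("MAE on held-out sections:", mean_absolute_error(y_test, model.predict(X_test)))
```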