977 results for Minimum Channel Problem
Abstract:
The year is 2015, and the startup and tech business ecosystem has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding – the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine out of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail – but the general consensus among top tech executives is that “startups make products that no one wants” (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. In this book he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today’s fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customers’ time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process, in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, with the purpose of developing a functional business model and business plan.
The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.
Abstract:
It is impossible to ignore the changes that the internet and e-commerce have caused in the retail sector, raising customers’ expectations and forcing retailers to adapt their business to the new digital era. The internet is characterized by increased accessibility for everyone, which can be a benefit or a drawback. For instance, luxury products rely on a sense of exclusivity rather than on being accessible to everyone. Hence, the internet represents a challenge for luxury brands since, although they are able to provide a full service to their customers, they need to maintain the exclusivity on which luxury is built. Consequently, the appearance of omni-channel was more than a challenge for the luxury sector in particular, given the need to provide a fully integrated experience across different channels. The aim of this dissertation is to find out how important omni-channel is, even in the luxury industry, and how it is actually implemented, based on the case of one of the most successful companies in the luxury fashion e-commerce industry – Farfetch. Even though the company started in London, its founder is a Portuguese entrepreneur, and it is in Portugal that most of its employees work, divided between two offices – Guimarães and Porto. A literature review was therefore written on relevant concepts and ideas about luxury, e-commerce and the different channel approaches. Five propositions were formulated and subsequently discussed in light of the information gathered about the company and its strategies. In the end, it was possible to identify which propositions are in accordance with theory and which are not, as well as to understand the most important omni-channel strategies and trends in the luxury fashion e-commerce sector.
Abstract:
Branding Lab
Abstract:
This work presents an improved model to solve the non-emergency patient transport (NEPT) service issues arising from the new rules recently established in Portugal. The model follows the same principle as the Team Orienteering Problem, selecting the patients to be included in the routes so as to maximize the cost reduction compared with individual transportation. The model thus establishes the best sets of patients to be transported together. The model was implemented in AMPL, and a compact formulation was solved using the NEOS Server. A heuristic procedure based on iteratively solving Orienteering Problems is presented; this heuristic provides good results in terms of accuracy and computation time. Euclidean instances as well as asymmetric real data gathered from Google Maps were used, and the model shows promising performance, particularly with asymmetric cost matrices.
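As an illustration of the cost-saving principle described above (a shared route is kept only when it beats transporting the same patients individually), here is a minimal greedy sketch in Python. The distance matrix, the capacity, and the pairwise-merge rule are all invented for illustration; this is not the paper's AMPL model nor its Orienteering-Problem heuristic.

```python
def route_cost(route, dist):
    """Cost of serving the route's patients in order, starting and ending at depot 0."""
    tour = [0] + list(route) + [0]
    return sum(dist[a][b] for a, b in zip(tour, tour[1:]))

def greedy_grouping(patients, dist, capacity):
    """Start from individual transport; repeatedly apply the pairwise merge
    with the largest positive saving until no merge reduces total cost."""
    routes = [[p] for p in patients]
    while True:
        best = None
        for i in range(len(routes)):
            for j in range(i + 1, len(routes)):
                if len(routes[i]) + len(routes[j]) > capacity:
                    continue
                merged = routes[i] + routes[j]
                saving = (route_cost(routes[i], dist)
                          + route_cost(routes[j], dist)
                          - route_cost(merged, dist))
                if saving > 0 and (best is None or saving > best[0]):
                    best = (saving, i, j, merged)
        if best is None:
            return routes
        _, i, j, merged = best
        routes = [r for k, r in enumerate(routes) if k not in (i, j)]
        routes.append(merged)

# Illustrative symmetric cost matrix: index 0 is the depot,
# patients 1 and 2 live close together, patient 3 lives far away.
DIST = [[0, 2, 2, 10],
        [2, 0, 1, 10],
        [2, 1, 0, 10],
        [10, 10, 10, 0]]
print(greedy_grouping([1, 2, 3], DIST, capacity=2))  # patients 1 and 2 share a route
```

With asymmetric matrices (as in the Google Maps data) the same code applies unchanged, since `route_cost` never assumes `dist[a][b] == dist[b][a]`.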
Abstract:
This chapter aims at developing a taxonomic framework to classify the studies on the flexible job shop scheduling problem (FJSP). The FJSP is a generalization of the classical job shop scheduling problem (JSP), which is one of the oldest NP-hard problems. Although various solution methodologies have been developed to obtain good solutions in reasonable time for FJSPs with different objective functions and constraints, no study that systematically reviews the FJSP literature has been found. In the proposed taxonomy, the type of study, type of problem, objective, methodology, data characteristics, and benchmarking are the main categories. To verify the proposed taxonomy, a variety of papers from the literature are classified. Using this classification, several inferences are drawn and gaps in the FJSP literature are identified. With the proposed taxonomy, the aim is to provide a broad view of the FJSP literature and construct a basis for future studies.
Abstract:
The selective collection of municipal solid waste for recycling is a very complex and expensive process, in which a major issue is planning cost-efficient waste collection routes. Despite the abundance of commercially available fleet-management software, such tools often lack the capability to deal properly with sequencing problems and with the dynamic revision of plans and schedules during process execution. Our approach to achieving better solutions for the waste collection process is to model it as a vehicle routing problem, more specifically as a team orienteering problem in which capacity constraints on the vehicles are considered, as well as time windows for the waste collection points and for the vehicles. The resulting model is called the capacitated team orienteering problem with double time windows (CTOPdTW). We developed a genetic algorithm to solve routing problems in waste collection modelled as a CTOPdTW. The results achieved suggest possible reductions of logistics costs in selective waste collection.
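A stripped-down genetic-algorithm sketch of the kind of encoding such a solver might use (permutation chromosome, order crossover, swap mutation, capacity-aware decoding). Time windows are omitted for brevity, and the instance data, fitness weights and parameters below are invented for illustration; this is not the authors' implementation.

```python
import random

# Illustrative instance: collection point -> (x, y, waste amount).
POINTS = {1: (0, 2, 3), 2: (2, 0, 4), 3: (5, 5, 2), 4: (1, 1, 5)}
CAPACITY = 7        # vehicle capacity (invented)
MAX_ROUTES = 2      # fleet size (invented)
DEPOT = (0, 0)

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def decode(perm):
    """Split a permutation of points into vehicle routes respecting capacity."""
    routes, cur, load = [], [], 0
    for p in perm:
        w = POINTS[p][2]
        if load + w > CAPACITY:
            routes.append(cur)
            cur, load = [], 0
        cur.append(p)
        load += w
    if cur:
        routes.append(cur)
    return routes[:MAX_ROUTES]          # only MAX_ROUTES vehicles exist

def fitness(perm):
    """Reward collected waste, penalize travelled distance."""
    routes = decode(perm)
    profit = sum(POINTS[p][2] for r in routes for p in r)
    travel = 0.0
    for r in routes:
        stops = [DEPOT] + [POINTS[p][:2] for p in r] + [DEPOT]
        travel += sum(dist(a, b) for a, b in zip(stops, stops[1:]))
    return profit - 0.1 * travel

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest from b."""
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [p for p in b if p not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(perm):
    perm = perm[:]
    i, j = random.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]
    return perm

def ga(generations=100, pop_size=20):
    pop = [random.sample(list(POINTS), len(POINTS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                      # keep the best half
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)
```

Adding the double time windows would only change `decode` and `fitness` (a route accumulating arrival times and discarding infeasible visits); the genetic operators act on the permutation and stay the same.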
Abstract:
To solve a health and safety problem at a waste treatment facility, different multicriteria decision methods were used, including the PROV Exponential decision method. Four alternatives and ten attributes were considered. We found a congruent solution, validated by the different methods. The AHP and the PROV Exponential decision methods led us to the same ordering of options, but the latter reinforced one of the options as the best performing and set apart the least performing option. The ELECTRE I method also led to the same ordering, which allowed us to identify the best solution with reasonable confidence. This paper demonstrates the potential of using multicriteria decision methods to support decision making in complex problems such as risk control and accident prevention.
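To make the "alternatives scored on attributes, then ranked" setup concrete, here is a minimal weighted-sum ranking sketch. This is the simplest multicriteria scheme, not AHP, ELECTRE I or the PROV Exponential method themselves (all of which involve additional machinery such as pairwise comparisons or outranking relations); the alternatives, attribute values and weights are invented.

```python
def normalize(matrix):
    """Scale each attribute column to [0, 1] (higher = better)."""
    cols = list(zip(*matrix))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 1.0
             for v, l, h in zip(row, lo, hi)] for row in matrix]

def rank(matrix, weights):
    """Return alternative indices ordered from best to worst weighted score."""
    scores = [sum(w * v for w, v in zip(weights, row))
              for row in normalize(matrix)]
    return sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)

# Four hypothetical alternatives scored on three benefit attributes
# (the real case has ten attributes; these numbers and weights are invented).
SCORES = [[7, 2, 9],
          [4, 8, 6],
          [9, 9, 3],
          [2, 4, 4]]
WEIGHTS = [0.5, 0.3, 0.2]
print(rank(SCORES, WEIGHTS))   # indices of alternatives, best first
```

Cross-checking such a simple aggregate against the orderings produced by AHP and ELECTRE I is one quick way to sanity-check a multicriteria study.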
Abstract:
Master's dissertation in Marketing and Strategy
Abstract:
The final ATLAS Run 1 measurements of Higgs boson production and couplings in the decay channel H→ZZ∗→ℓ⁺ℓ⁻ℓ′⁺ℓ′⁻, where ℓ, ℓ′ = e or μ, are presented. These measurements were performed using pp collision data corresponding to integrated luminosities of 4.5 fb⁻¹ and 20.3 fb⁻¹ at centre-of-mass energies of 7 TeV and 8 TeV, respectively, recorded with the ATLAS detector at the LHC. The H→ZZ∗→4ℓ signal is observed with a significance of 8.1 standard deviations at 125.36 GeV, the combined ATLAS measurement of the Higgs boson mass from the H→γγ and H→ZZ∗→4ℓ channels. The production rate relative to the Standard Model expectation, the signal strength, is measured in four different production categories in the H→ZZ∗→4ℓ channel. The measured signal strength, at this mass and with all categories combined, is 1.44 +0.40/−0.33. The signal strength for Higgs boson production in gluon fusion or in association with tt̄ or bb̄ pairs is found to be 1.7 +0.5/−0.4, while the signal strength for vector-boson fusion combined with WH/ZH associated production is found to be 0.3 +1.6/−0.9.
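The "signal strength" quoted above is conventionally defined as the measured rate divided by the Standard Model expectation; schematically (standard notation, not quoted verbatim from the paper):

```latex
\mu \;=\; \frac{\sigma \cdot \mathrm{BR}(H \to ZZ^{*} \to 4\ell)}
               {\left[\sigma \cdot \mathrm{BR}(H \to ZZ^{*} \to 4\ell)\right]_{\mathrm{SM}}}
```

so that μ = 1 corresponds exactly to the Standard Model prediction, and the quoted 1.44 +0.40/−0.33 is compatible with it within uncertainties.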
Abstract:
This Letter presents a search at the LHC for s-channel single top-quark production in proton-proton collisions at a centre-of-mass energy of 8 TeV. The analyzed data set was recorded by the ATLAS detector and corresponds to an integrated luminosity of 20.3 fb⁻¹. Selected events contain one charged lepton, large missing transverse momentum and exactly two b-tagged jets. A multivariate event classifier based on boosted decision trees is developed to discriminate s-channel single top-quark events from the main background contributions. The signal extraction is based on a binned maximum-likelihood fit of the output classifier distribution. The analysis leads to an upper limit on the s-channel single top-quark production cross-section of 14.6 pb at the 95% confidence level. The fit gives a cross-section of σ_s = 5.0 ± 4.3 pb, consistent with the Standard Model expectation.
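Schematically, a binned maximum-likelihood fit of a classifier distribution maximizes a product of Poisson terms over the histogram bins (the generic textbook form; the actual analysis likelihood also includes nuisance-parameter constraint terms):

```latex
\mathcal{L}(\mu) \;=\; \prod_{i \in \text{bins}}
  \frac{\left(\mu\, s_i + b_i\right)^{n_i}}{n_i!}\,
  e^{-(\mu\, s_i + b_i)}
```

where s_i and b_i are the expected signal and background yields in bin i, n_i the observed counts, and μ scales the signal cross-section; the fitted μ̂ translates into the quoted cross-section and upper limit.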
Abstract:
This Letter presents a search for a hidden-beauty counterpart of the X(3872) in the mass ranges 10.05–10.31 GeV and 10.40–11.00 GeV, in the channel X_b→π⁺π⁻Υ(1S)(→μ⁺μ⁻), using 16.2 fb⁻¹ of √s = 8 TeV pp collision data collected by the ATLAS detector at the LHC. No evidence for new narrow states is found, and upper limits are set on the product of the X_b cross section and branching fraction, relative to those of the Υ(2S), at the 95% confidence level using the CLs approach. These limits range from 0.8% to 4.0%, depending on mass. For masses above 10.1 GeV, the expected upper limits from this analysis are the most restrictive to date. Searches for production of the Υ(1³D_J), Υ(10860), and Υ(11020) states also reveal no significant signals.
Abstract:
The mass of the top quark is measured in a data set corresponding to 4.6 fb⁻¹ of proton-proton collisions at a centre-of-mass energy of √s = 7 TeV collected by the ATLAS detector at the LHC. Events consistent with hadronic decays of top-antitop quark pairs with at least six jets in the final state are selected. The substantial background from multijet production is modelled with data-driven methods that utilise the number of identified b-quark jets and the transverse momentum of the sixth leading jet, which have minimal correlation. The top-quark mass is obtained from template fits to the ratio of the three-jet mass to the dijet mass. The three-jet mass is calculated from the three jets of a top-quark decay, and from these three jets the dijet mass is obtained from the two jets of the W boson decay. The top-quark mass obtained from this fit is thus less sensitive to the uncertainty in the energy measurement of the jets. A binned likelihood fit yields a top-quark mass of m_t = 175.1 ± 1.4 (stat.) ± 1.2 (syst.) GeV.
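The fitted observable is the ratio of the three-jet (top-candidate) mass to the dijet (W-candidate) mass, commonly denoted R₃/₂ (notation assumed here, not quoted from the paper):

```latex
R_{3/2} \;=\; \frac{m_{jjj}}{m_{jj}}
```

Because a jet-energy-scale shift moves the numerator and denominator in a correlated way, much of it cancels in the ratio, which is why the fit is less sensitive to the jet energy measurement.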
Abstract:
Studies of the spin and parity quantum numbers of the Higgs boson in the WW∗→eνμν final state are presented, based on proton-proton collision data collected by the ATLAS detector at the Large Hadron Collider, corresponding to an integrated luminosity of 20.3 fb⁻¹ at a centre-of-mass energy of √s = 8 TeV. The Standard Model spin-parity J^CP = 0⁺⁺ hypothesis is compared with alternative hypotheses for both spin and CP. The case where the observed resonance is a mixture of the Standard-Model-like Higgs boson and a CP-even (J^CP = 0⁺⁺) or CP-odd (J^CP = 0⁺⁻) Higgs boson in scenarios beyond the Standard Model is also studied. The data are found to be consistent with the Standard Model prediction, and limits are placed on alternative spin and CP hypotheses, including CP mixing in different scenarios.