950 results for Railways, Scheduling, Heuristics, Search Algorithms
Abstract:
INTRODUCTION: This study was developed to evaluate the situation of leprosy in the general population of the municipality of Buriticupu, State of Maranhão, Brazil. METHODS: We used the method of active search to identify new cases from 2008 to 2010. Bacilloscopy of intradermal scrapings was performed in all patients with skin lesions compatible with leprosy, and histopathological examination in those whose clinical form was in doubt. RESULTS: The study included 19,104 individuals, with 42 patients diagnosed with leprosy after clinical examination, representing a detection rate of 219.84 per 100,000 inhabitants. The predominant clinical presentation was tuberculoid, with 24 (57.1%) cases, followed by borderline with 11, indeterminate with four, and lepromatous with three. The study also allowed the identification of 81 patients with a history of leprosy and of other skin diseases, such as pityriasis versicolor, dermatophytosis, scabies, vitiligo, and skin carcinoma. The binomial test showed that the proportion of cases in the municipal seat was significantly higher than that in the villages (p = 0.04), and the generalized exact test showed no association between age and clinical form (p = 0.438) or between age and gender (p = 0.083). CONCLUSIONS: The elevated detection rate defines the city as hyperendemic for leprosy; active search for cases, together with the organization of health services, is an important method for disease control.
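For reference, the reported detection rate follows directly from the counts above (a worked check of the arithmetic):

\[
\text{detection rate} = \frac{42}{19{,}104} \times 100{,}000 \approx 219.8 \text{ per } 100{,}000 \text{ inhabitants},
\]

consistent with the 219.84 reported.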
Abstract:
This paper studies the effects of monetary policy on mutual fund risk taking using a sample of Portuguese fixed-income mutual funds in the 2000-2012 period. First, I estimate time-varying measures of risk exposure (betas) for the individual funds, for the benchmark portfolio, and for a representative equally-weighted portfolio, through 24-month rolling regressions of a two-factor model with two systematic risk factors: interest rate risk (TERM) and default risk (DEF). Second, using the estimated betas, I examine what portion of the risk exposure is in excess of the benchmark (active risk) and how it relates to monetary policy proxies (one-month rate, Taylor residual, real rate, and first principal component of a cross-section of government yields and rates). Using this methodology, I provide empirical evidence that Portuguese fixed-income mutual funds respond to accommodative monetary policy by significantly increasing exposure, in excess of their benchmarks, to default risk and, to a smaller extent, to interest rate risk. I also find that the increase in funds' risk exposure to gain a boost in return (search-for-yield) is more pronounced following the 2007-2009 global financial crisis, indicating that the current historically low interest rates may incentivize excessive risk taking. My results suggest that monetary policy affects the risk appetite of non-bank financial intermediaries.
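A minimal sketch of the first step, the 24-month rolling two-factor regression, in Python with NumPy; the factor and return series here are simulated placeholders, and active risk would then be the fund's beta minus the benchmark's beta estimated the same way:

import numpy as np

rng = np.random.default_rng(0)
T = 156                       # monthly observations covering 2000-2012
term = rng.normal(size=T)     # interest rate risk factor (TERM)
default = rng.normal(size=T)  # default risk factor (DEF)
# simulated fund excess returns with true betas 0.4 (TERM) and 0.3 (DEF)
fund = 0.4 * term + 0.3 * default + rng.normal(scale=0.1, size=T)

window = 24                   # 24-month rolling window, as in the abstract
betas = []
for t in range(window, T + 1):
    X = np.column_stack([np.ones(window), term[t - window:t], default[t - window:t]])
    y = fund[t - window:t]
    _, b_term, b_def = np.linalg.lstsq(X, y, rcond=None)[0]
    betas.append((b_term, b_def))  # time-varying exposures to TERM and DEF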
Abstract:
This paper attempts to prove that in the years 1735 to 1755 Venice was the birthplace and cradle of Modern architectural theory, generating a major crisis in classical architecture, traditionally based on the Vitruvian assumption that it imitates early wooden structures in stone or in marble. According to its rationalist critics, such as the Venetian Observant Franciscan friar and architectural theorist Carlo Lodoli (1690-1761) and his nineteenth-century followers, classical architecture is singularly deceptive and not true to the nature of materials; in other words, dishonest and fallacious. This questioning did not emanate from practising architects but from Lodoli himself, a philosopher and educator of the Venetian patriciate who had not been trained as an architect. The roots of this crisis lay in a new approach to architecture stemming from the rationalist philosophy of the Enlightenment age, with its emphasis on reason and universal criticism.
Abstract:
Search is now going beyond looking for factual information, and people wish to search for the opinions of others to help them in their own decision-making. Sentiment or opinion expressions are used by users to express their opinions and embody important pieces of information, particularly in online commerce. The main problem the present dissertation addresses is how to model text to find meaningful words that express a sentiment. In this context, I investigate the viability of automatically generating a sentiment lexicon for opinion retrieval and sentiment classification applications. To this end, we propose to capture sentiment words derived from online users' reviews. In this approach, we tackle a major challenge in sentiment analysis: the detection of words that express subjective preference and of domain-specific sentiment words such as jargon. To this aim we present a fully generative method that automatically learns a domain-specific lexicon and is fully independent of external sources. Sentiment lexicons can be applied in a broad set of applications; however, popular recommendation algorithms have largely been disconnected from sentiment analysis. Therefore, we present a study that explores the viability of applying sentiment analysis techniques to infer ratings in a recommendation algorithm. Furthermore, entities' reputation is intrinsically associated with sentiment words that have a positive or negative relation with those entities. Hence, a study is provided that examines the viability of using a domain-specific lexicon to compute entities' reputation. Finally, a recommendation system algorithm is improved with the use of sentiment-based ratings and entities' reputation.
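The dissertation proposes a fully generative model; as a much simpler stand-in, the sketch below builds a domain-specific lexicon by scoring words with a smoothed log-odds ratio between high-rated and low-rated reviews (the toy data and rating threshold are illustrative assumptions):

import math
from collections import Counter

# toy corpus of (review text, rating); a real corpus would come from online reviews
reviews = [("great battery and solid build", 5),
           ("awful screen and awful support", 1),
           ("solid value, great screen", 4),
           ("awful packaging", 2)]

pos, neg = Counter(), Counter()
for text, rating in reviews:
    (pos if rating >= 4 else neg).update(text.split())

def log_odds(word, smoothing=1.0):
    # log-odds of the word appearing in positive vs. negative reviews
    p = (pos[word] + smoothing) / (sum(pos.values()) + 2 * smoothing)
    n = (neg[word] + smoothing) / (sum(neg.values()) + 2 * smoothing)
    return math.log(p / n)

lexicon = {w: log_odds(w) for w in set(pos) | set(neg)}
print(sorted(lexicon.items(), key=lambda kv: kv[1], reverse=True)[:5])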
Abstract:
This research intends to examine whether there were significant differences in brand engagement and in electronic word-of-mouth (e-WOM) referral intention through Facebook between Generation X and Generation Y (also called millennials). This study also intends to examine whether there are differences in the motivations that drive these generations to interact with brands through Facebook. Results indicated that Generation Y members consumed more content on brands' Facebook pages than Generation X members. They were also more likely to have an e-WOM referral intention, as well as being more driven by brand affiliation and opportunity seeking. Finally, currently employed individuals were found to contribute more content than students. This study fills a gap in the literature by addressing how marketing professionals should market their brand and interact and engage with their customers, based on customers' generational cohort.
Abstract:
This chapter aims to develop a taxonomic framework to classify the studies on the flexible job shop scheduling problem (FJSP). The FJSP is a generalization of the classical job shop scheduling problem (JSP), which is one of the oldest NP-hard problems. Although various solution methodologies have been developed to obtain good solutions in reasonable time for FJSPs with different objective functions and constraints, no study that systematically reviews the FJSP literature has been encountered. In the proposed taxonomy, the type of study, type of problem, objective, methodology, data characteristics, and benchmarking are the main categories. In order to verify the proposed taxonomy, a variety of papers from the literature are classified. Using this classification, several inferences are drawn and gaps in the FJSP literature are identified. With the proposed taxonomy, the aim is to develop a framework for a broad view of the FJSP literature and construct a basis for future studies.
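As an illustration only, the six main categories of the proposed taxonomy could be captured in a classification record like the following (the field values are hypothetical examples, not taken from the chapter):

from dataclasses import dataclass

@dataclass
class FJSPStudy:
    # the six main categories of the proposed taxonomy
    study_type: str            # e.g. "theoretical" or "application"
    problem_type: str          # e.g. "classical FJSP", "FJSP with setups"
    objective: str             # e.g. "minimize makespan"
    methodology: str           # e.g. "hybrid genetic algorithm", "tabu search"
    data_characteristics: str  # e.g. "benchmark instances", "industrial data"
    benchmarking: str          # e.g. "compared against Brandimarte instances"

example = FJSPStudy("theoretical", "classical FJSP", "minimize makespan",
                    "hybrid genetic algorithm", "benchmark instances",
                    "Brandimarte test set")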
Abstract:
The present paper reports on the precipitation of Al3Sc structures in an aluminum-scandium alloy, simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method was employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with spkMC, and DBSCAN proved to be a valuable aid in identifying the precipitates through a cluster analysis of the simulation results. The results are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation provided a 4x speedup over the sequential version.
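The paper's implementation is in C with MPI; purely as an illustration of the cluster-analysis step, the same idea can be sketched in Python with scikit-learn's DBSCAN (the coordinates, eps, and min_samples below are illustrative, not the paper's settings):

import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
# toy stand-in for spkMC output: 3D coordinates of Sc atoms,
# concentrated around two precipitate nuclei
sc_atoms = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 3))
                      for c in ([0, 0, 0], [10, 10, 10])])

# group nearby Sc atoms into precipitates; label -1 marks noise (isolated atoms)
labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(sc_atoms)
n_precipitates = len(set(labels)) - (1 if -1 in labels else 0)
print(f"identified {n_precipitates} precipitates")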
Abstract:
Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, setting appropriate weights on the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, one single-objective and two multi-objective, on two tasks related to weight-setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices and the second the possibility of link failures. The methods thus need to optimize simultaneously for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs, such as SPEA2 and NSGA-II, came naturally; these are compared against a single-objective EA. The results show a remarkable behavior of NSGA-II on all proposed tasks, scaling well to harder instances and thus presenting itself as the most promising option for TE in these scenarios.
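A minimal sketch of the bi-objective formulation described above, assuming a toy congestion cost over link weights; the cost function and demand data are placeholders, and the dominance test is the Pareto criterion that NSGA-II and SPEA2 build on:

import random

def congestion(weights, demands):
    # toy placeholder for the congestion cost of routing the demand
    # matrix over the shortest paths induced by the link weights
    return sum(d / w for w, d in zip(weights, demands))

def evaluate(weights, normal_demands, altered_demands):
    # bi-objective fitness: congestion in the normal and the altered state
    return (congestion(weights, normal_demands),
            congestion(weights, altered_demands))

def dominates(a, b):
    # Pareto dominance: a is no worse in every objective and better in one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

random.seed(0)
normal = [random.uniform(1, 5) for _ in range(10)]
altered = [d * random.uniform(1, 2) for d in normal]   # e.g. shifted demand matrix
population = [[random.randint(1, 20) for _ in range(10)] for _ in range(30)]
fits = [evaluate(w, normal, altered) for w in population]
front = [f for f in fits if not any(dominates(g, f) for g in fits if g is not f)]
print(f"Pareto front size: {len(front)}")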
Abstract:
In recent years, immune systems have inspired approaches to several computational problems. This paper focuses on enhancing the accuracy of behavioural biometric authentication algorithms by applying them more than once, with different thresholds, in order to first simulate the protection provided by the skin and then look for known outside entities, as lymphocytes do. The paper describes the principles that support the application of this approach to Keystroke Dynamics, a biometric authentication technology that decides on the legitimacy of a user based on the typing pattern captured as the user enters the username and/or the password. As a proof of concept, the accuracy levels of one keystroke dynamics algorithm applied to five legitimate users of a system are calculated for both the traditional and the immune-inspired approaches, and the results are compared.
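A minimal sketch of the two-pass idea, assuming a simple distance score between a captured typing pattern and a stored template; the thresholds, score function, and known-intruder set are all illustrative assumptions, not the paper's algorithm:

def distance(pattern, template):
    # mean absolute difference between keystroke timing vectors
    return sum(abs(p - t) for p, t in zip(pattern, template)) / len(template)

def authenticate(pattern, user_template, intruder_templates,
                 skin_threshold=0.15, lymphocyte_threshold=0.05):
    # first pass ("skin"): reject patterns clearly unlike the legitimate user
    if distance(pattern, user_template) > skin_threshold:
        return False
    # second pass ("lymphocytes"): reject patterns close to known outsiders
    return all(distance(pattern, t) > lymphocyte_threshold
               for t in intruder_templates)

user = [0.20, 0.15, 0.30, 0.25]          # stored template (key hold/flight times)
attempt = [0.21, 0.16, 0.28, 0.26]       # pattern captured at login
intruders = [[0.40, 0.35, 0.10, 0.50]]   # previously seen outside entities
print(authenticate(attempt, user, intruders))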
Abstract:
PhD thesis in Bioengineering
Abstract:
A measurement of spin correlation in $t\bar{t}$ production is presented using data collected with the ATLAS detector at the Large Hadron Collider in proton-proton collisions at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb$^{-1}$. The correlation between the top and antitop quark spins is extracted from dilepton $t\bar{t}$ events by using the difference in azimuthal angle between the two charged leptons in the laboratory frame. In the helicity basis the measured degree of correlation corresponds to $A_{\mathrm{helicity}} = 0.38 \pm 0.04$, in agreement with the Standard Model prediction. A search is performed for pair production of top squarks with masses close to the top quark mass decaying to predominantly right-handed top quarks and a light neutralino, the lightest supersymmetric particle. Top squarks with masses between the top quark mass and 191 GeV are excluded at the 95% confidence level.
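For context, the degree of spin correlation in a given quantization basis is conventionally defined from the numbers of events with aligned and anti-aligned spin projections (a standard definition, not quoted from this paper):

\[
A = \frac{N(\uparrow\uparrow) + N(\downarrow\downarrow) - N(\uparrow\downarrow) - N(\downarrow\uparrow)}{N(\uparrow\uparrow) + N(\downarrow\downarrow) + N(\uparrow\downarrow) + N(\downarrow\uparrow)},
\]

where the arrows denote the top and antitop spin projections along the chosen axis (here, the helicity axis).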
Abstract:
A search for a heavy, CP-odd Higgs boson, A, decaying into a Z boson and a 125 GeV Higgs boson, h, with the ATLAS detector at the LHC is presented. The search uses proton-proton collision data at a centre-of-mass energy of 8 TeV corresponding to an integrated luminosity of 20.3 fb$^{-1}$. Decays of the CP-even h boson to $\tau\tau$ or $b\bar{b}$ pairs with the Z boson decaying to electron or muon pairs are considered, as well as $h \to b\bar{b}$ decays with the Z boson decaying to neutrinos. No evidence for the production of an A boson in these channels is found, and the 95% confidence level upper limits derived for $\sigma(gg \to A) \times \mathrm{BR}(A \to Zh) \times \mathrm{BR}(h \to f\bar{f})$ are 0.098–0.013 pb for $f = \tau$ and 0.57–0.014 pb for $f = b$ in the range $m_A = 220$–1000 GeV. The results are combined and interpreted in the context of two-Higgs-doublet models.
Abstract:
A search is presented for the direct pair production of a chargino and a neutralino, $pp \to \tilde{\chi}_1^{\pm}\tilde{\chi}_2^{0}$, where the chargino decays to the lightest neutralino and the W boson, $\tilde{\chi}_1^{\pm} \to \tilde{\chi}_1^{0}(W^{\pm} \to \ell^{\pm}\nu)$, while the neutralino decays to the lightest neutralino and the 125 GeV Higgs boson, $\tilde{\chi}_2^{0} \to \tilde{\chi}_1^{0}(h \to b\bar{b}/\gamma\gamma/\ell^{\pm}\nu qq)$. The final states considered for the search have large missing transverse momentum, an isolated electron or muon, and one of the following: either two jets identified as originating from bottom quarks, or two photons, or a second electron or muon with the same electric charge. The analysis is based on 20.3 fb$^{-1}$ of $\sqrt{s} = 8$ TeV proton-proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with the Standard Model expectations, and limits are set in the context of a simplified supersymmetric model.