953 results for Interval discrete log problem
Abstract:
This work presents an improved model to solve the non-emergency patient transport (NEPT) service issues arising from the new rules recently established in Portugal. The model follows the same principle as the Team Orienteering Problem, selecting the patients to include in the routes that achieve the maximum cost reduction compared with individual transportation. The model thus establishes the best sets of patients to be transported together. It was implemented in AMPL, and a compact formulation was solved using the NEOS Server. A heuristic procedure based on iteratively solving Orienteering Problems is also presented, and it provides good results in terms of accuracy and computation time. Euclidean instances as well as asymmetric real data gathered from Google Maps were used, and the model shows promising performance, mainly with asymmetric cost matrices.
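As a minimal illustration of the cost-saving principle described above (a sketch, not the paper's AMPL formulation), the saving obtained by grouping two patients into one route can be computed against a hypothetical asymmetric cost matrix:

```python
# Illustrative sketch: saving from transporting two patients together
# instead of individually. Indices: 0 = depot, 1..n = patients.
# The cost matrix below is hypothetical (rows: from, cols: to) and
# deliberately asymmetric, as with real road-network travel costs.
def pair_saving(cost, i, j):
    """Saving of route depot->i->j->depot vs two individual round trips."""
    individual = (cost[0][i] + cost[i][0]) + (cost[0][j] + cost[j][0])
    combined = cost[0][i] + cost[i][j] + cost[j][0]
    return individual - combined

cost = [
    [0, 4, 6],
    [5, 0, 2],
    [7, 3, 0],
]
print(pair_saving(cost, 1, 2))  # -> 9 (positive: worth grouping)
```

A model in the spirit of the paper would then search for the sets of patients whose joint routes maximize the total of such savings.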
Abstract:
This chapter aims at developing a taxonomic framework to classify studies on the flexible job shop scheduling problem (FJSP). The FJSP is a generalization of the classical job shop scheduling problem (JSP), which is one of the oldest NP-hard problems. Although various solution methodologies have been developed to obtain good solutions in reasonable time for FJSPs with different objective functions and constraints, no study that systematically reviews the FJSP literature has been found. In the proposed taxonomy, the type of study, type of problem, objective, methodology, data characteristics, and benchmarking are the main categories. To verify the proposed taxonomy, a variety of papers from the literature are classified. Using this classification, several inferences are drawn and gaps in the FJSP literature are identified. With the proposed taxonomy, the aim is to provide a framework for a broad view of the FJSP literature and to construct a basis for future studies.
Abstract:
The selective collection of municipal solid waste for recycling is a very complex and expensive process, in which a major issue is to perform cost-efficient waste collection routes. Despite the abundance of commercially available software for fleet management, such tools often lack the capability to deal properly with sequencing problems and with the dynamic revision of plans and schedules during process execution. Our approach to achieving better solutions for the waste collection process is to model it as a vehicle routing problem, more specifically as a team orienteering problem with capacity constraints on the vehicles as well as time windows for the waste collection points and for the vehicles. The final model is called the capacitated team orienteering problem with double time windows (CTOPdTW). We developed a genetic algorithm to solve routing problems in waste collection modelled as a CTOPdTW. The results achieved suggest possible reductions in logistics costs in selective waste collection.
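A toy genetic algorithm in the spirit of the approach above can be sketched for the capacitated selection core of the problem (time windows, routing order, and all point data are omitted or hypothetical; this is not the paper's algorithm):

```python
import random

# Hypothetical collection points: profit of visiting each, and vehicle load.
profits = [10, 7, 5, 9, 3]
loads   = [4, 3, 2, 5, 1]
CAPACITY = 8

def fitness(bits):
    """Total profit of the selected points; infeasible selections score 0."""
    load = sum(l for b, l in zip(bits, loads) if b)
    if load > CAPACITY:
        return 0
    return sum(p for b, p in zip(bits, profits) if b)

def evolve(pop_size=30, gens=100, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in profits] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(profits))  # one-point crossover
            child = a[:cut] + b[cut:]
            k = rng.randrange(len(child))         # one-bit mutation
            child[k] ^= 1
            children.append(child)
        pop = parents + children                  # parents survive (elitism)
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The real CTOPdTW chromosome would additionally encode visit order per vehicle and penalize time-window violations in the fitness function.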
Abstract:
To solve a health and safety problem at a waste treatment facility, different multicriteria decision methods were used, including the PROV Exponential decision method. Four alternatives and ten attributes were considered. We found a congruent solution, validated by the different methods. The AHP and the PROV Exponential decision method led us to the same ordering of options, but the latter reinforced one of the options as the best performing one and singled out the least performing option. The ELECTRE I method also led to the same ordering, which allowed the best solution to be identified with reasonable confidence. This paper demonstrates the potential of multicriteria decision methods to support decision making on complex problems such as risk control and accident prevention.
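For readers unfamiliar with AHP, one of the methods named above, its core step derives priority weights from a pairwise-comparison matrix. A minimal sketch using the geometric-mean approximation follows (the 3x3 judgement matrix is hypothetical, not the paper's data):

```python
import math

# Hypothetical pairwise comparisons: A[i][j] = how much alternative i is
# preferred over j on the usual 1-9 AHP scale (reciprocal below diagonal).
A = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]

# Geometric mean of each row, normalized, approximates the principal
# eigenvector of A, i.e. the priority weight of each alternative.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]
print([round(w, 3) for w in weights])  # sums to 1; first option ranks highest
```

In a full AHP study, weights like these are computed per attribute and aggregated, with a consistency-ratio check on each judgement matrix.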
Abstract:
Corrosion of steel reinforced concrete elements is one of the common pathologies that limit the long-term performance of urban infrastructures. This problem causes the loss of structural serviceability by decreasing the concrete-steel bond strength and reducing the cross-section of the reinforcements. The present study introduces a new system for developing corrosion-free prefabricated manhole covers for applications in aggressive environments, e.g. wastewater collector systems, sewer systems, and stormwater systems. Fibre reinforced cement composites were applied in this system in order to eliminate the corrodible steel mesh while maintaining structural ductility. A fibre reinforced polymer (FRP) system is adopted as an additional solution for increasing the load-carrying capacity of these elements without corrosion concerns. The effectiveness of the applied strategy in developing the manhole covers, in terms of load-carrying capacity and failure mode, is evaluated in this research. Furthermore, this paper discusses a FEM-based simulation aiming to address the possibility of calibrating the constitutive model parameters related to fracture modes I and II.
Abstract:
This paper presents the findings of an experimental campaign that was conducted to investigate the seismic behaviour of log houses. A two-storey log house designed by the Portuguese company Rusticasa® was subjected to a series of shaking table tests at LNEC, Lisbon, Portugal. The paper contains the description of the geometry and construction of the house and all the aspects related to the testing procedure, namely the pre-design, the setup, instrumentation and the testing process itself. The shaking table tests were carried out with a scaled spectrum of the Montenegro (1979) earthquake, at increasing levels of PGA, starting from 0.07g, moving on to 0.28g and finally 0.5g. The log house did not suffer any major damage and remained in working condition throughout the entire process. The preliminary analysis of the overall behaviour of the log house is also discussed.
Abstract:
The dearth of knowledge on the load resistance mechanisms of log houses and the need for numerical models capable of simulating the actual behaviour of these structures have pushed research into the relatively unexplored aspects of log house construction. The aim of the research presented in this paper is to build a working model of a log house that will contribute toward understanding the behaviour of these structures under seismic loading. The paper presents the results of a series of shaking table tests conducted on a log house and goes on to develop a numerical model of the tested house. The finite element model has been created in SAP2000 and validated against the experimental results. The modelling assumptions and the difficulties involved in the process are described and, finally, a discussion on the effects of varying different physical and material parameters on the results yielded by the model is drawn up.
Abstract:
Childhood protection is a subject of high value for society, but child abuse cases are difficult to identify. The process from suspicion to accusation is very difficult to complete, as it requires very strong evidence. Typically, health care services deal with these cases from the beginning, when there is evidence based on the diagnosis, but that evidence is not enough to support an accusation. In addition, the subject is highly sensitive because there are legal aspects to deal with, such as patient privacy, paternity issues, and medical confidentiality, among others. We propose a child abuse critical-knowledge monitoring system model that addresses this problem. This decision support system combines multiple scientific domains: capture of tokens from clinical documents from multiple sources; a topic model approach to identify the topics of the documents; and knowledge management through ontologies that support the critical-knowledge concepts and relations, such as symptoms and behaviours, among other evidence, in order to match them with the topics inferred from the clinical documents and then alert and log when clinical evidence is present. Based on these alerts, clinical personnel can analyse the situation and take the appropriate procedures.
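The alerting step of such a monitor can be sketched as a match between ontology concept terms and tokens captured from a clinical note. This is a drastic simplification of the topic-model pipeline described above, and every concept and term below is a hypothetical example:

```python
# Hypothetical ontology: concept -> evidence terms (illustrative only).
ontology = {
    "physical_abuse": {"bruise", "fracture", "burn"},
    "neglect": {"malnutrition", "unwashed"},
}

def alerts(document):
    """Return the concepts whose evidence terms appear in the document."""
    tokens = set(document.lower().split())
    return sorted(c for c, terms in ontology.items() if terms & tokens)

note = "Child presents with bruise on arm and old fracture"
print(alerts(note))  # -> ['physical_abuse']
```

The proposed system replaces this keyword match with topic inference over the documents, so that evidence is detected even when phrased differently from the ontology terms.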
Abstract:
Master's dissertation in telematic networks and services
Abstract:
The logarithmic transformation of the bivariate ratios used to calculate the norms and indices of the Diagnosis and Recommendation Integrated System (DRIS) has been suggested as a way to improve the accuracy of the system, mainly by reducing the inconsistency in the frequency distribution between the direct and inverse forms of expression of the same ratio. Accordingly, the objective of this work was to evaluate the use of log-transformed ratios across different reference populations. Leaf samples of cupuaçu were collected from 153 commercial orchards, with plant ages ranging from 5 to 18 years, grown in monoculture or in agroforestry systems. For each nutritional ratio among the nutrients N, P, K, Ca, Mg, Fe, Cu, Zn, and Mn, log-transformed and untransformed bivariate DRIS norms were obtained for the population as a whole and for specific conditions. The results showed that the log-transformed ratios contribute to greater consistency between the direct and inverse forms across different DRIS norms.
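The advantage of the log transform rests on one line of algebra: log(A/B) = -log(B/A), so the direct and inverse forms of the same nutrient ratio carry identical information up to sign, which is not true of the raw ratios A/B and B/A. A one-ratio sketch (the leaf concentrations are hypothetical):

```python
import math

# Hypothetical leaf concentrations, g/kg (e.g. N and P).
n_conc, p_conc = 22.5, 1.8

direct = math.log(n_conc / p_conc)    # log(N/P)
inverse = math.log(p_conc / n_conc)   # log(P/N)
print(round(direct, 4), round(inverse, 4))  # symmetric about zero

# By contrast, the raw ratios are not symmetric: N/P = 12.5, P/N = 0.144,
# so their frequency distributions across a population differ in shape.
```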
Abstract:
The Symbol Digit Modalities Test (SDMT) is a widely used instrument to assess information processing speed, attention, visual scanning, and tracking. Considering that repeated evaluations are a common need in neuropsychological assessment routines, we explored the test–retest reliability and practice effects of two alternate SDMT forms with a short inter-assessment interval. A total of 123 university students completed the written SDMT version at two time points separated by a 150-min interval. Half of the participants completed the same form on both occasions, while the other half completed different forms. Overall, reasonable test–retest reliabilities were found (r = .70), and the subjects who completed the same form showed significant practice effects (p < .001, dz = 1.61), which were almost non-existent in those completing different forms. These forms were found to be moderately reliable and to elicit similar performance across participants, suggesting their utility in repeated cognitive assessments when brief inter-assessment intervals are required.
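The two statistics reported above, the test–retest correlation r and the practice-effect size dz, can be computed by hand. A sketch on hypothetical paired scores (SDMT correct responses at time 1 and time 2 for five participants; not the study's data):

```python
import math

t1 = [48, 52, 55, 60, 47]   # hypothetical time-1 scores
t2 = [53, 58, 60, 66, 51]   # hypothetical time-2 scores

def pearson_r(x, y):
    """Test-retest reliability: Pearson correlation of the two sessions."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cohen_dz(x, y):
    """Practice effect: mean within-subject gain over SD of the gains."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md / sd

print(round(pearson_r(t1, t2), 3), round(cohen_dz(t1, t2), 3))
```

Note that dz is computed on the difference scores, which is why a test can show both high reliability and a large practice effect, as in the same-form group above.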
Abstract:
A precise estimation of the postmortem interval (PMI) is one of the most important topics in forensic pathology. However, PMI estimation is based mainly on the visual observation of cadaverous phenomena (e.g. algor, livor and rigor mortis) and on alternative methods such as thanatochemistry, which remain relatively imprecise. The aim of this in vitro study was to evaluate the kinetic alterations of several biochemical parameters (i.e. proteins, enzymes, substrates, electrolytes and lipids) during putrefaction of human blood. For this purpose, we performed kinetic biochemical analyses over a 264-hour period. The results showed a significant linear correlation of total and direct bilirubin, urea, uric acid, transferrin, immunoglobulin M (IgM), creatine kinase (CK), aspartate transaminase (AST), calcium and iron with the time of blood putrefaction. These parameters allowed us to develop two mathematical models that may have predictive value and become important complementary tools to traditional methods for a more accurate PMI estimation.
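The kind of linear relation the study exploits can be sketched with an ordinary least-squares fit of one marker against time, inverted to predict PMI from a measurement. The concentration values are hypothetical, not the study's data:

```python
# Hypothetical kinetic series over the 264-hour window.
hours = [0, 24, 48, 96, 168, 264]
marker = [1.0, 1.9, 3.1, 5.2, 8.8, 13.6]   # e.g. a urea-like marker, a.u.

# Ordinary least squares: slope = cov(x, y) / var(x).
n = len(hours)
mx = sum(hours) / n
my = sum(marker) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(hours, marker))
         / sum((x - mx) ** 2 for x in hours))
intercept = my - slope * mx

def predict_pmi(value):
    """Invert the fitted line to estimate PMI (hours) from a measured value."""
    return (value - intercept) / slope

print(round(slope, 4), round(predict_pmi(5.2), 1))
```

The study's actual models combine several such markers; a single-marker inversion like this is only the simplest possible form of that idea.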
Abstract:
OBJECTIVE: To determine, in arrhythmogenic right ventricular cardiomyopathy, the value of QT interval dispersion for identifying the induction of sustained ventricular tachycardia in the electrophysiological study or the risk of sudden cardiac death. METHODS: We assessed QT interval dispersion in the 12-lead electrocardiogram of 26 patients with arrhythmogenic right ventricular cardiomyopathy and in 16 controls similar in age and sex, and analyzed its association with sustained ventricular tachycardia and sudden cardiac death. RESULTS: (mean ± SD) QT interval dispersion: patients = 53.8±14.1 ms; control group = 35.0±10.6 ms, p=0.001. Patients with induction of ventricular tachycardia: 52.5±13.8 ms; without induction: 57.5±12.8 ms, p=0.420. In a mean follow-up period of 41±11 months, five sudden cardiac deaths occurred. QT interval dispersion in this group was 62.0±17.8 ms, and in the others it was 51.9±12.8 ms, p=0.852. Using a cutoff ≥60 ms to define an increased degree of QT interval dispersion, we were able to identify patients at risk of sudden cardiac death with a sensitivity of 60%, a specificity of 57%, and positive and negative predictive values of 25% and 85%, respectively. CONCLUSION: Patients with arrhythmogenic right ventricular cardiomyopathy have a significantly increased degree of QT interval dispersion compared with the healthy population. However, it did not identify patients with induction of ventricular tachycardia in the electrophysiological study and showed a very low predictive value for defining the risk of sudden cardiac death in the population studied.
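The reported operating characteristics can be reproduced from a 2x2 table consistent with the abstract's figures (26 patients, 5 sudden deaths, cutoff ≥60 ms). The counts below are reconstructed for illustration, not taken from the paper:

```python
# Reconstructed 2x2 table (hypothetical counts consistent with the abstract).
tp, fn = 3, 2    # sudden deaths with / without dispersion >= 60 ms
fp, tn = 9, 12   # survivors with / without dispersion >= 60 ms

sensitivity = tp / (tp + fn)   # 0.60
specificity = tn / (tn + fp)   # ~0.57
ppv = tp / (tp + fp)           # 0.25
npv = tn / (tn + fn)           # ~0.86 (the abstract rounds this to 85%)
print(sensitivity, round(specificity, 2), ppv, round(npv, 2))
```

The low PPV reflects the small number of events (5 of 26), which is exactly why the abstract concludes the cutoff has limited value for risk stratification here.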
Abstract:
OBJECTIVE: Parasympathetic dysfunction is an independent risk factor in individuals with coronary artery disease, and cholinergic stimulation is a potential therapeutic option. We determined the effects of pyridostigmine bromide, a reversible anticholinesterase agent, on electrocardiographic variables of healthy individuals. METHODS: We carried out a cross-sectional, double-blind, randomized, placebo-controlled study. We obtained electrocardiographic tracings in 12 simultaneous leads from 10 healthy young individuals at rest, before and after oral administration of 45 mg of pyridostigmine or placebo. RESULTS: Pyridostigmine increased RR intervals (before: 886±27 ms vs after: 1054±37 ms) and decreased QTc dispersion (before: 72±9 ms vs after: 45±3 ms), without changing other electrocardiographic variables (PR segment, QT interval, QTc, and QT dispersion). CONCLUSION: The bradycardia and the reduction in QTc dispersion induced by pyridostigmine may represent a protective mechanism if these results can be reproduced in individuals with cardiovascular diseases.
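The two quantities this study tracks can be sketched in a few lines: the Bazett-corrected QT (QTc = QT / sqrt(RR), with RR in seconds) and QT dispersion, the max-minus-min QT across the 12 leads. All interval values below are hypothetical:

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett's heart-rate correction: QTc = QT / sqrt(RR in seconds)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

# Hypothetical QT measurements (ms) in the 12 simultaneous leads.
qt_by_lead = [380, 392, 401, 388, 395, 410, 385, 399, 404, 390, 397, 386]
dispersion = max(qt_by_lead) - min(qt_by_lead)

print(round(qtc_bazett(400, 886)), dispersion)  # -> 425 30 (ms)
```

Note the direction of the effect reported above: lengthening RR from 886 to 1054 ms lowers the QTc that a fixed QT maps to, and QTc dispersion is this spread computed on the corrected intervals.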