996 results for Feasibility problems


Relevance: 30.00%

Abstract:

This paper investigates whether low-technology, driver-only, battery electric commuter vehicles are feasible for New Zealand. Personal passenger transport faces several challenges in the coming decades: depletion of cheap oil reserves, increasing congestion, localised pollution, the need for reduced carbon emissions, and the long-term goal of sustainability. One way of solving some of these problems could be to introduce low-cost, comfortable, energy-efficient, driver-only electric vehicles. These would still give the driver a weatherproof, safe and comfortable means of commuting, but at a fraction of the energy and running costs of conventional petrol/diesel cars. To help assess their viability, the performance and energy use of the E-POD electric commuter vehicle is used as a benchmark. The work shows that such a vehicle could be made cheaply from readily available technology, with a range of 180 km and a top speed of over 90 km/h. The chassis could be made from natural fibre composite materials, which might significantly reduce the embedded energy required for its manufacture. The electricity taken from the grid to charge the batteries could be replaced by electricity generated from grid-connected photovoltaic panels mounted on the vehicle owner's garage roof.
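To make the scale of the PV-charging claim concrete, here is a minimal back-of-envelope sketch. The consumption figure, commute length, charger efficiency, insolation and panel efficiency are illustrative assumptions, not values reported in the paper.

```python
# Back-of-envelope check: can rooftop PV plausibly cover the charging
# energy of a light commuter EV?  All numbers below are illustrative
# assumptions, not values from the paper.

daily_commute_km = 40          # assumed round-trip commute
consumption_kwh_per_km = 0.08  # assumed for a light, driver-only EV
charger_efficiency = 0.85      # assumed grid-to-battery efficiency

daily_charge_kwh = daily_commute_km * consumption_kwh_per_km / charger_efficiency

peak_sun_hours = 4.0           # assumed average daily full-sun hours
panel_kw_per_m2 = 0.18         # assumed ~18% efficient modules at 1 kW/m^2 irradiance

required_panel_area_m2 = daily_charge_kwh / (peak_sun_hours * panel_kw_per_m2)

print(f"Daily charging energy: {daily_charge_kwh:.1f} kWh")
print(f"Approx. PV area needed: {required_panel_area_m2:.1f} m^2")
```

Under these assumptions the daily charge is a few kilowatt-hours and the required panel area is on the order of 5 m², which is consistent with the garage-roof scenario described in the abstract.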

Relevance: 30.00%

Abstract:

This work presents hybrid Constraint Programming (CP) and metaheuristic methods for the solution of large-scale optimization problems; it aims at integrating concepts and mechanisms from metaheuristic methods into a CP-based tree search environment in order to exploit the advantages of both approaches. The modeling and solution of large-scale combinatorial optimization problems is a topic that has attracted the interest of many researchers in the Operations Research field; combinatorial optimization problems are widespread in everyday life, and the need to solve difficult problems is increasingly urgent. Metaheuristic techniques have been developed over the last decades to effectively handle the approximate solution of combinatorial optimization problems; we examine metaheuristics in detail, focusing on the aspects that different techniques have in common. Each metaheuristic approach has its own peculiarities in designing and guiding the solution process; our work aims at recognizing components that can be extracted from metaheuristic methods and reused in different contexts. In particular we focus on the possibility of porting metaheuristic elements to constraint programming based environments, since constraint programming deals with the feasibility issues of optimization problems very effectively. Moreover, CP offers a general paradigm that allows any type of problem to be modeled easily and solved with a problem-independent framework, unlike local search and metaheuristic methods, which are highly problem-specific. In this work we describe the implementation of the Local Branching framework, originally developed for Mixed Integer Programming, in a CP-based environment. Constraint programming specific features are used to ease the search process, while maintaining the full generality of the approach. We also propose a search strategy called Sliced Neighborhood Search (SNS), which iteratively explores slices of large neighborhoods of an incumbent solution by performing CP-based tree search and incorporates concepts from metaheuristic techniques. SNS can be used as a stand-alone search strategy, but it can alternatively be embedded in existing strategies as an intensification and diversification mechanism; in particular, we show its integration within CP-based local branching. We provide an extensive experimental evaluation of the proposed approaches on instances of the Asymmetric Traveling Salesman Problem and of the Asymmetric Traveling Salesman Problem with Time Windows. The proposed approaches achieve good results on practical-size problems, demonstrating the benefit of integrating metaheuristic concepts into CP-based frameworks.
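The abstract gives no implementation details, so the following is only a minimal sketch of the sliced-neighborhood idea on a toy 0-1 knapsack: at each iteration a random "slice" of variables is left free while the rest are fixed to the incumbent, and the slice is searched exhaustively (standing in for the CP tree search over the neighborhood). The instance data, slice size and iteration count are arbitrary assumptions.

```python
import itertools
import random

# Toy 0-1 knapsack instance (arbitrary assumption, not from the thesis).
values   = [10, 7, 4, 9, 6, 8, 3, 5]
weights  = [ 5, 4, 2, 6, 3, 5, 1, 4]
capacity = 15

def evaluate(x):
    """Return the total value, or None if the capacity constraint is violated."""
    total_weight = sum(w for w, xi in zip(weights, x) if xi)
    if total_weight > capacity:
        return None
    return sum(v for v, xi in zip(values, x) if xi)

def sliced_neighborhood_search(incumbent, slice_size=4, iterations=50, seed=0):
    """Repeatedly free a random slice of variables, fix the others to the
    incumbent, and search the slice exhaustively (a stand-in for CP tree
    search restricted to the neighborhood)."""
    rng = random.Random(seed)
    best = list(incumbent)
    best_val = evaluate(best)
    n = len(best)
    for _ in range(iterations):
        free = rng.sample(range(n), slice_size)
        for assignment in itertools.product([0, 1], repeat=slice_size):
            cand = list(best)
            for idx, val in zip(free, assignment):
                cand[idx] = val
            val = evaluate(cand)
            if val is not None and (best_val is None or val > best_val):
                best, best_val = cand, val
    return best, best_val

start = [0] * len(values)              # trivially feasible incumbent
solution, value = sliced_neighborhood_search(start)
print("best value:", value, "solution:", solution)
```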

Relevance: 30.00%

Abstract:

The use of linear programming in various areas has increased with the significant improvement of specialized solvers. Linear programs are used as such to model practical problems, or as subroutines in algorithms such as formal proofs or branch-and-cut frameworks. In many situations a certified answer is needed, for example the guarantee that the linear program is feasible or infeasible, or a provably safe bound on its objective value. Most of the available solvers work with floating-point arithmetic and are thus subject to its shortcomings, such as rounding errors or underflow; they can therefore deliver incorrect answers. While adequate for some applications, this is unacceptable for critical applications like flight control or nuclear plant management because of the potentially catastrophic consequences. We propose a method that gives a certified answer as to whether a linear program is feasible or infeasible, or returns 'unknown'. The advantage of our method is that it is reasonably fast and rarely answers 'unknown'. It works by computing a safe solution that is, in a certain sense, the best possible in the relative interior of the feasible set. To certify the relative interior, we employ exact arithmetic, whose use is nevertheless generally limited to critical places, allowing us to remain computationally efficient. Moreover, when certain conditions are fulfilled, our method is able to deliver a provable bound on the objective value of the linear program. We test our algorithm on typical benchmark sets and obtain higher rates of success compared to previous approaches for this problem, while keeping the running times acceptably small. The computed objective value bounds are in most cases very close to the known exact objective values. We demonstrate the usability of the developed method by additionally employing a variant of it in a different scenario, namely to improve the results of a Satisfiability Modulo Theories solver. Our method is used as a black box in the nodes of a branch-and-bound tree to implement conflict learning based on the certificate of infeasibility for linear programs consisting of subsets of linear constraints. The generated conflict clauses are generally small and give good prospects for reducing the search space. Compared to other methods, we obtain significant improvements in the running time, especially on large instances.
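As an illustration of the general idea of certifying with exact arithmetic only at the final, critical step (not the paper's actual algorithm), the sketch below takes a floating-point candidate point, converts its coordinates to exact rationals with Python's fractions module, and verifies A x <= b without any rounding error. The small constraint system and candidate point are made up for the example.

```python
from fractions import Fraction

# Tiny made-up system A x <= b; in the paper's setting the candidate point
# would come from a floating-point LP solver and lie in the relative
# interior of the feasible region.
A = [[ 1,  1],
     [-1,  0],
     [ 0, -1]]
b = [4, 0, 0]

candidate = [1.2499999999, 1.75]   # floating-point solution to be certified

def certify_feasible(A, b, x):
    """Certify A x <= b using exact rational arithmetic.

    Each float is converted to the exact rational it represents, so every
    comparison below is free of rounding error."""
    xr = [Fraction(xi) for xi in x]
    for row, bi in zip(A, b):
        lhs = sum(Fraction(aij) * xj for aij, xj in zip(row, xr))
        if lhs > Fraction(bi):
            return False               # constraint provably violated
    return True                        # all constraints provably satisfied

print("certified feasible:", certify_feasible(A, b, candidate))
```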

Relevance: 30.00%

Abstract:

BACKGROUND: Health risk appraisal is a promising method for health promotion and prevention in older persons. The Health Risk Appraisal for the Elderly (HRA-E) developed in the U.S. has unique features but has not been tested outside the United States. METHODS: Based on the original HRA-E, we developed a scientifically updated and regionally adapted multilingual Health Risk Appraisal for Older Persons (HRA-O) instrument consisting of a self-administered questionnaire and software-generated feedback reports. We evaluated the practicability and performance of the questionnaire in non-disabled community-dwelling older persons in London (U.K.) (N = 1090), Hamburg (Germany) (N = 804), and Solothurn (Switzerland) (N = 748) in a sub-sample of an international randomised controlled study. RESULTS: Over eighty percent of invited older persons returned the self-administered HRA-O questionnaire. Fair or poor self-perceived health status and older age were correlated with higher rates of non-return of the questionnaire. Older participants and those with lower educational levels reported more difficulty in completing the HRA-O questionnaire than younger and more highly educated participants. However, even among older participants and those with a low educational level, more than 80% rated the questionnaire as easy to complete. Prevalence rates of risks for functional decline or problems were between 2% and 91% for the 19 HRA-O domains. Participants' intention to change health behaviour suggested that for some risk factors participants were in a pre-contemplation phase, having no short- or medium-term plans for change. Many participants perceived their health behaviour or preventative care uptake as optimal, despite indications of deficits according to the HRA-O based evaluation. CONCLUSION: The HRA-O questionnaire was highly accepted by a broad range of community-dwelling non-disabled persons. It identified a high number of risks and problems, and provided information on participants' intention to change health behaviour.

Relevance: 30.00%

Abstract:

BACKGROUND: Percutaneous closure of patent foramen ovale (PFO) is generally performed using intra-procedural guidance by transoesophageal (TEE) or intracardiac (ICE) echocardiography. While TEE requires sedation or general anaesthesia, ICE is costly and adds incremental risk, and both imaging modalities lengthen the procedure. METHODS: A total of 825 consecutive patients (age 51 ± 13 years; 58% male) underwent percutaneous PFO closure solely under fluoroscopic guidance, without intra-procedural echocardiography. The indications for PFO closure were presumed paradoxical embolism in 698 patients (95% cerebral, 5% other locations), an embolic event with concurrent aetiologies in 47, diving in 51, migraine headaches in 13, and other reasons in 16. An atrial septal aneurysm was associated with the PFO in 242 patients (29%). RESULTS: Permanent device implantation failed in two patients (0.2%). There were 18 procedural complications (2.2%), including embolization of the device or parts of it in five patients, with successful percutaneous removal in all cases; air embolism with transient symptoms in four patients; pericardial tamponade requiring pericardiocentesis in one patient; a transient ischaemic attack with visual symptoms in one patient; and vascular access site problems in seven patients. There were no long-term sequelae. Contrast TEE at six months showed complete abolition of right-to-left shunt via PFO in 88% of patients, whereas a minimal, moderate or large residual shunt persisted in 7%, 3%, and 2%, respectively. CONCLUSIONS: This study confirms the safety and feasibility of percutaneous PFO closure without intra-procedural echocardiographic guidance in a large cohort of consecutive patients.

Relevance: 30.00%

Abstract:

BACKGROUND Although well established for suspected lower limb deep venous thrombosis, an algorithm combining a clinical decision score, D-dimer testing, and ultrasonography has not been evaluated for suspected upper extremity deep venous thrombosis (UEDVT). OBJECTIVE To assess the safety and feasibility of a new diagnostic algorithm in patients with clinically suspected UEDVT. DESIGN Diagnostic management study (ClinicalTrials.gov: NCT01324037). SETTING 16 hospitals in Europe and the United States. PATIENTS 406 inpatients and outpatients with suspected UEDVT. MEASUREMENTS The algorithm consisted of the sequential application of a clinical decision score, D-dimer testing, and ultrasonography. Patients were first categorized as likely or unlikely to have UEDVT; in those with an unlikely score and normal D-dimer levels, UEDVT was excluded. All other patients underwent (repeated) compression ultrasonography. The primary outcome was the 3-month incidence of symptomatic UEDVT and pulmonary embolism in patients with a normal diagnostic work-up. RESULTS The algorithm was feasible and completed in 390 of the 406 patients (96%). In 87 patients (21%), an unlikely score combined with normal D-dimer levels excluded UEDVT. Superficial venous thrombosis and UEDVT were diagnosed in 54 (13%) and 103 (25%) patients, respectively. All 249 patients with a normal diagnostic work-up, including those with protocol violations (n = 16), were followed for 3 months. One patient developed UEDVT during follow-up, for an overall failure rate of 0.4% (95% CI, 0.0% to 2.2%). LIMITATIONS This study was not powered to show the safety of the substrategies, and D-dimer testing was done locally. CONCLUSION The combination of a clinical decision score, D-dimer testing, and ultrasonography can safely and effectively exclude UEDVT. If confirmed by other studies, this algorithm has potential as a standard approach to suspected UEDVT. PRIMARY FUNDING SOURCE None.
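A minimal sketch of the sequential triage logic described in the abstract follows. The decision-score cutoff, the D-dimer threshold and the handling of repeat imaging are illustrative assumptions, not the study's validated parameters.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    decision_score: int                          # clinical decision score (higher = UEDVT more likely)
    d_dimer_ng_ml: float                         # D-dimer concentration
    ultrasound_positive: Optional[bool] = None   # set only if imaging is performed

# Illustrative thresholds -- assumptions for this sketch, not study cut-offs.
LIKELY_SCORE_CUTOFF = 2
D_DIMER_NORMAL_LIMIT_NG_ML = 500.0

def triage(patient: Patient) -> str:
    """Sequential work-up: decision score -> D-dimer -> (repeated) compression ultrasonography."""
    unlikely = patient.decision_score < LIKELY_SCORE_CUTOFF
    normal_d_dimer = patient.d_dimer_ng_ml < D_DIMER_NORMAL_LIMIT_NG_ML

    if unlikely and normal_d_dimer:
        return "UEDVT excluded without imaging"

    # All remaining patients proceed to compression ultrasonography,
    # repeated after a few days if the first examination is negative or inconclusive.
    if patient.ultrasound_positive:
        return "UEDVT confirmed"
    return "UEDVT not confirmed; clinical follow-up"

print(triage(Patient(decision_score=1, d_dimer_ng_ml=300.0)))
print(triage(Patient(decision_score=3, d_dimer_ng_ml=900.0, ultrasound_positive=True)))
```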

Relevance: 30.00%

Abstract:

PURPOSES Geriatric problems frequently go undetected in older patients in emergency departments (EDs), increasing their risk of adverse outcomes. We evaluated a novel emergency geriatric screening (EGS) tool designed to detect geriatric problems. BASIC PROCEDURES The EGS tool consisted of short validated instruments used to screen 4 domains (cognition, falls, mobility, and activities of daily living). Emergency geriatric screening was introduced for ED patients 75 years or older throughout a 4-month period. We analyzed the prevalence of abnormal EGS findings and whether EGS increased the number of EGS-related diagnoses made in the ED during the screening period, as compared with a preceding control period. MAIN FINDINGS Emergency geriatric screening was performed on 338 (42.5%) of 795 patients presenting during screening. Emergency geriatric screening was unfeasible in 175 patients (22.0%) because of life-threatening conditions and was not performed in 282 (35.5%) for logistical reasons. Emergency geriatric screening took less than 5 minutes to perform in most (85.8%) cases. Among screened patients, 285 (84.3%) had at least 1 abnormal EGS finding. In 270 of these patients, at least 1 abnormal EGS finding did not result in a diagnosis in the ED and was reported for further workup in subsequent care. During screening, 142 patients (42.0%) had at least 1 diagnosis listed within the 4 EGS domains, significantly more than the 29.3% in the control period (odds ratio, 1.75; 95% confidence interval, 1.34-2.29; P < .001). Emergency geriatric screening predicted nursing home admission after the in-hospital stay (odds ratio for ≥3 vs <3 abnormal domains, 12.13; 95% confidence interval, 2.79-52.72; P = .001). PRINCIPAL CONCLUSIONS The novel EGS is feasible, identifies previously undetected geriatric problems, and predicts determinants of subsequent care.

Relevance: 30.00%

Abstract:

Indoor Air Quality (IAQ) can have significant implications for health, productivity, job performance, and operating cost. Professional experience in the field of indoor air quality suggests that high expectations of workplace indoor air quality (better than nationally established standards, such as those of the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE)) lead to increased air quality complaints. To determine whether there is a positive association between expectations and indoor air quality complaints, a one-time descriptive and analytical cross-sectional pilot study was conducted. Area Safety Liaisons (n = 330) at the University of Texas Health Science Center – Houston were asked to answer a questionnaire regarding their expectations of four workplace indoor air quality indicators (temperature, relative humidity, carbon dioxide, and carbon monoxide) and whether they had experienced and reported indoor air quality problems. A chi-square test for independence was used to evaluate associations among the variables of interest. The response rate was 54% (n = 177). Results did not show significant associations between expectations and indoor air quality complaints. However, a greater proportion of Area Safety Liaisons who expected indoor air quality indicators to be better than the established standard experienced greater indoor air quality problems. Similarly, a slightly higher proportion of Area Safety Liaisons who expected indoor air quality indicators to be better than the standard reported greater indoor air quality complaints. The findings indicated that a greater proportion of Area Safety Liaisons with high expectations (conditions beyond what is considered normal and acceptable by ASHRAE) experienced greater indoor air quality discomfort. This result suggests a positive association between high expectations and experienced and reported indoor air quality complaints. Future studies may be able to address whether the frequency of complaints and resulting investigations can be reduced through information and education about what constitutes acceptable conditions.
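For readers unfamiliar with the analysis mentioned above, here is a minimal sketch of a chi-square test of independence on a hypothetical 2x2 table (expectation level vs. reported complaints). The counts are invented for illustration and are not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table (made-up counts, not study data):
#                         reported complaints   no complaints
# high expectations                30                 60
# standard expectations            20                 67
observed = [[30, 60],
            [20, 67]]

chi2, p_value, dof, expected = chi2_contingency(observed)

print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A p-value above the chosen significance level (e.g. 0.05) means the test
# does not reject independence between expectations and complaints.
```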

Relevance: 30.00%

Abstract:

One of the main challenges of fuzzy community detection problems is to be able to measure the quality of a fuzzy partition. In this paper, we present an alternative way of measuring the quality of a fuzzy community detection output based on n-dimensional grouping and overlap functions. Moreover, the proposed modularity measure generalizes the classical Girvan–Newman (GN) modularity both for crisp community detection problems and for crisp overlapping community detection problems. It can therefore be used to compare partitions of different natures (i.e. those composed of classical, overlapping, and fuzzy communities). In particular, as is usually done with GN modularity, the proposed measure may be used to identify the optimal number of communities to be obtained by any network clustering algorithm in a given network. We illustrate this usage by adapting a well-known fuzzy community detection algorithm in this way, extending it to also handle overlapping community detection problems and to produce a ranking of the overlapping nodes. Computational experiments show the feasibility of the proposed approach to modularity measures based on n-dimensional overlap and grouping functions.
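The paper's generalization via n-dimensional grouping and overlap functions is not reproduced here; as a reference point, the sketch below computes the classical Girvan–Newman modularity for a crisp partition of a small undirected graph, which is the quantity the proposed measure generalizes. The toy graph and partition are assumptions for illustration.

```python
from collections import defaultdict

# Toy undirected graph (edge list) and a crisp partition -- both assumed.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}

def girvan_newman_modularity(edges, community):
    """Classical GN modularity:
    Q = (1 / 2m) * sum_ij (A_ij - k_i * k_j / (2m)) * delta(c_i, c_j)."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    two_m = 2.0 * len(edges)
    edge_set = {frozenset(e) for e in edges}

    q = 0.0
    nodes = list(degree)
    for i in nodes:
        for j in nodes:
            if community[i] != community[j]:
                continue
            a_ij = 1.0 if i != j and frozenset((i, j)) in edge_set else 0.0
            q += a_ij - degree[i] * degree[j] / two_m
    return q / two_m

print(f"Q = {girvan_newman_modularity(edges, community):.3f}")  # ~0.357 for this toy graph
```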

Relevance: 30.00%

Abstract:

Information Retrieval systems normally have to work with rather heterogeneous sources, such as Web sites or documents from Optical Character Recognition tools. The correct conversion of these sources into flat text files is not a trivial task, since noise may easily be introduced as a result of spelling or typesetting errors. Interestingly, this is not a great drawback when the size of the corpus is sufficiently large, since redundancy helps to overcome noise problems. However, noise becomes a serious problem in restricted-domain Information Retrieval, especially when the corpus is small and has little or no redundancy. This paper devises an approach that adds noise tolerance to Information Retrieval systems. A set of experiments carried out in the agricultural domain proves the effectiveness of the presented approach.
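The abstract does not describe the specific mechanism used, so the sketch below shows only one common way of tolerating OCR and spelling noise at query time: matching query terms against index terms by bounded edit distance. The vocabulary, query terms and distance threshold are assumptions, not the paper's approach.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

# Assumed toy index vocabulary containing OCR-corrupted terms.
index_terms = ["irrigation", "irrigatlon", "fertilizer", "fert1lizer", "harvest"]

def noisy_match(query_term: str, vocabulary, max_distance: int = 1):
    """Return vocabulary terms within a small edit distance of the query term."""
    return [t for t in vocabulary if edit_distance(query_term, t) <= max_distance]

print(noisy_match("irrigation", index_terms))   # matches the corrupted variant too
print(noisy_match("fertilizer", index_terms))
```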

Relevance: 30.00%

Abstract:

The construction industry has long been considered a highly fragmented and non-collaborative industry. This fragmentation stems from complex and unstructured traditional coordination processes and information exchanges amongst all parties involved in a construction project. This nature, coupled with risk and uncertainty, has pushed clients and their supply chains to search for new ways of improving their business processes to deliver better quality and higher-performing products. This research closely investigates the need to implement a Digital Nervous System (DNS), analogous to a biological nervous system, for the flow and management of digital information across the project lifecycle. This is done through direct examination of the key processes and information produced in a construction project and of how a DNS can provide a well-integrated flow of digital information throughout the project lifecycle. The research also investigates how a DNS can create a tight digital feedback loop that enables the organisation to sense, react and adapt to changing project conditions. A Digital Nervous System is a digital infrastructure that provides a well-integrated flow of digital information to the right part of the organisation at the right time. It provides the organisation with the relevant and up-to-date information it needs on critical project issues to aid near real-time decision-making. A literature review and survey questionnaires were used in this research to collect and analyse data about the industry's information management problems – e.g. disruption and discontinuity of digital information flow due to interoperability issues, disintegration/fragmentation of the adopted digital solutions, and paper-based transactions. Analysis of the results revealed that efficient and effective information management requires the creation and implementation of a DNS.

Relevance: 30.00%

Abstract:

An iterative method for the parabolic Cauchy problem in planar domains having a finite number of corners is implemented based on boundary integral equations. At each iteration, mixed well-posed problems are solved for the same parabolic operator. The presence of corner points introduces singularities in the solutions of these mixed problems; this is handled with weight functions together with, in the numerical implementation, mesh grading near the corners. Discretization of the time derivative reduces the problem to an elliptic system of partial differential equations, and the mixed problems are then reformulated in terms of boundary integrals. To solve these integral equations numerically, a Nyström method with super-algebraic convergence order is employed. Numerical results are presented showing the feasibility of the proposed approach. © 2014 IMACS.
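To illustrate the Nyström idea in isolation (not the paper's parabolic system), the sketch below solves a toy second-kind Fredholm equation u(x) - integral over [0,1] of x*t*u(t) dt = 2x/3, whose exact solution is u(x) = x, using Gauss–Legendre quadrature. The kernel, right-hand side and node count are assumptions chosen for the example.

```python
import numpy as np

# Toy second-kind Fredholm equation (assumed example, not the paper's system):
#   u(x) - \int_0^1 x * t * u(t) dt = 2x/3,   exact solution u(x) = x.

def kernel(x, t):
    return x * t

def rhs(x):
    return 2.0 * x / 3.0

n = 12                                           # number of quadrature nodes
nodes, weights = np.polynomial.legendre.leggauss(n)
t = 0.5 * (nodes + 1.0)                          # map Gauss-Legendre nodes to [0, 1]
w = 0.5 * weights

# Nystrom discretization: (I - K diag(w)) u = f at the quadrature nodes.
K = kernel(t[:, None], t[None, :])
A = np.eye(n) - K * w[None, :]
u = np.linalg.solve(A, rhs(t))

print("max error at nodes:", np.max(np.abs(u - t)))   # near machine precision
```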

Relevance: 30.00%

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were assessed by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance to implementation may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further, and more R&D work is needed before advancing to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify it and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research: Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems with the implementation of web-based project management systems in the dredging industry. Keen's (1981) incremental-change and facilitative-approach tactics were used as a basis for classifying solutions and for determining how to overcome resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance: 30.00%

Abstract:

This paper presents a web-based expert system application that carries out an initial assessment of the feasibility of a web project. The system allows inconsistency problems to be detected before design starts and suggests corrective actions to resolve them. The developed system offers important advantages, not only in determining the feasibility of a web project but also by acting as a means of communication between the client company and the web development team, making the requirements specification clearer.
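As a purely illustrative sketch of the kind of rule-based consistency check such an expert system might run (the paper does not list its actual rules), the snippet below evaluates a few hypothetical feasibility rules over a project description and reports suggested corrective actions. The rule names, thresholds and project fields are all assumptions.

```python
# Hypothetical feasibility rules for a web project; each rule yields a
# (problem, corrective action) pair when it fires.  All fields, thresholds
# and suggestions are made-up assumptions, not the paper's rule base.

project = {
    "budget_eur": 8000,
    "estimated_cost_eur": 12000,
    "deadline_weeks": 4,
    "estimated_effort_weeks": 10,
    "requirements_signed_off": False,
}

RULES = [
    (lambda p: p["budget_eur"] < p["estimated_cost_eur"],
     "Budget below estimated cost",
     "Reduce scope or renegotiate the budget with the client."),
    (lambda p: p["deadline_weeks"] < p["estimated_effort_weeks"],
     "Deadline shorter than estimated effort",
     "Extend the deadline or plan a phased delivery."),
    (lambda p: not p["requirements_signed_off"],
     "Requirements not signed off",
     "Hold a requirements review with the client before design starts."),
]

def assess(project):
    findings = [(problem, action) for rule, problem, action in RULES if rule(project)]
    return findings or [("No inconsistencies detected", "Project looks feasible to start design.")]

for problem, action in assess(project):
    print(f"- {problem}: {action}")
```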