894 results for test case optimization
Abstract:
We propose a positive, accurate moment closure for linear kinetic transport equations based on a filtered spherical harmonic (FP_N) expansion in the angular variable. The FP_N moment equations are accurate approximations to linear kinetic equations, but they are known to suffer from the occurrence of unphysical, negative particle concentrations. The new positive filtered P_N (FP_N+) closure is developed to address this issue. The FP_N+ closure approximates the kinetic distribution by a spherical harmonic expansion that is non-negative on a finite, predetermined set of quadrature points. With an appropriate numerical PDE solver, the FP_N+ closure generates particle concentrations that are guaranteed to be non-negative. Under an additional, mild regularity assumption, we prove that as the moment order tends to infinity, the FP_N+ approximation converges, in the L2 sense, at the same rate as the FP_N approximation; numerical tests suggest that this assumption may not be necessary. Numerical experiments on the challenging line source benchmark problem confirm that the FP_N+ method indeed produces accurate and non-negative solutions. To apply the FP_N+ closure to problems at large temporal-spatial scales, we develop a positive asymptotic-preserving (AP) numerical PDE solver. We prove that the proposed AP scheme maintains stability and accuracy with standard mesh sizes at large temporal-spatial scales, while generic numerical schemes require excessive refinement of the temporal-spatial meshes. We also show that the proposed scheme preserves positivity of the particle concentration under a time-step restriction. Numerical results confirm that the proposed AP scheme is capable of solving linear transport equations at large temporal-spatial scales for which a generic scheme could fail.
Constrained optimization problems arise in the formulation of the FP_N+ closure to enforce non-negativity of the FP_N+ approximation on the set of quadrature points. These optimization problems can be written as strictly convex quadratic programs (CQPs) with a large number of inequality constraints. To solve the CQPs efficiently, we propose a constraint-reduced variant of a Mehrotra predictor-corrector algorithm with a novel constraint selection rule. We prove that, under appropriate assumptions, the proposed optimization algorithm converges globally to the solution at a locally q-quadratic rate. We test the algorithm on randomly generated problems, and the numerical results indicate that the combination of the proposed algorithm and the constraint selection rule outperforms the other constraint-reduced algorithms compared, especially on problems with many more inequality constraints than variables.
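As an illustration of the kind of CQP involved, the positivity constraint can be cast as finding the moment vector closest to the given moments whose reconstruction is non-negative at the quadrature points. The sketch below is only an analogue, not the paper's method: it uses a 1D Legendre expansion instead of spherical harmonics, a generic SLSQP solver instead of the constraint-reduced predictor-corrector algorithm, and made-up moment values.

```python
import numpy as np
from scipy.optimize import minimize

# 1D analogue of the positivity-enforcing CQP (sketch): given moments u of a
# Legendre expansion, find the closest moments v (in the least-squares sense)
# whose reconstruction is non-negative at a fixed set of quadrature points.
N = 6
pts, _ = np.polynomial.legendre.leggauss(20)   # predetermined quadrature points
A = np.polynomial.legendre.legvander(pts, N)   # basis functions evaluated there

u = np.array([1.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0])  # expansion that dips negative

res = minimize(lambda v: 0.5 * np.sum((v - u) ** 2), u,
               jac=lambda v: v - u,
               constraints=[{'type': 'ineq', 'fun': lambda v: A @ v}])
v = res.x   # nearest moments with a non-negative reconstruction
```

The inequality constraint has one row per quadrature point, which is exactly the "many more constraints than variables" regime the constraint-reduced algorithm targets.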
Abstract:
Introduction Cerebral misery perfusion represents a failure of cerebral autoregulation. It is an important differential diagnosis in post-stroke patients presenting with collapses in the presence of haemodynamically significant cerebrovascular stenosis, particularly when cortical or internal watershed infarcts are present. When this condition occurs, prompt further investigation is warranted. Case presentation A 50-year-old Caucasian man presented with a stroke secondary to complete occlusion of his left internal carotid artery. He went on to suffer recurrent seizures. Neuroimaging demonstrated numerous new watershed-territory cerebral infarcts. No source of arterial thromboembolism was demonstrable. Hypercapnic blood-oxygenation-level-dependent-contrast functional magnetic resonance imaging was used to measure his cerebrovascular reserve capacity. The findings were suggestive of cerebral misery perfusion. Conclusions Blood-oxygenation-level-dependent-contrast functional magnetic resonance imaging allows the inference of cerebral misery perfusion. This procedure is cheaper and more readily available than positron emission tomography imaging, which is the current gold standard diagnostic test. The most evaluated treatment for cerebral misery perfusion is extracranial-intracranial bypass. Although previous trials of this have been unfavourable, the results of new studies involving extracranial-intracranial bypass in high-risk patients identified during cerebral perfusion imaging are awaited. Cerebral misery perfusion is an important and under-recognized condition in which emerging imaging and treatment modalities present the possibility of practical and evidence-based management in the near future. Physicians should thus be aware of this disorder and of recent developments in diagnostic tests that allow its detection.
Abstract:
A Bayesian optimization algorithm for the nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse’s assignment. Unlike our previous work that used GAs to implement implicit learning, the learning in the proposed algorithm is explicit, i.e. eventually, we will be able to identify and mix building blocks directly. The Bayesian optimization algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, a new rule string has been obtained. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed approach might be suitable for other scheduling problems.
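A much-simplified sketch of the generate-and-update loop described above: here the positions of the rule string are treated as independent (a marginal model standing in for the paper's full Bayesian network of conditional probabilities), and the set of promising rule strings is made up for illustration.

```python
import numpy as np

# Simplified sketch of the sampling step: estimate per-position rule
# probabilities from a set of promising rule strings, then draw a new
# string position by position. (Independence between positions is
# assumed here only to keep the sketch short; all data is hypothetical.)
rng = np.random.default_rng(0)
n_nurses, n_rules = 5, 4

promising = np.array([[0, 1, 2, 0, 3],      # hypothetical promising strings:
                      [0, 1, 2, 1, 3],      # one scheduling-rule index
                      [0, 2, 2, 0, 3],      # per nurse
                      [1, 1, 2, 0, 3]])

# P(rule r chosen at position i), estimated from the promising set
probs = np.array([np.bincount(promising[:, i], minlength=n_rules) / len(promising)
                  for i in range(n_nurses)])

# generate one new rule string from those probabilities
new_string = np.array([rng.choice(n_rules, p=probs[i]) for i in range(n_nurses)])
```

In the full algorithm, the new strings would be evaluated, the best retained by fitness selection, and the probabilities re-estimated until the stopping conditions are met.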
Abstract:
Agents offer a new and exciting way of understanding the world of work. In this paper we describe the development of agent-based simulation models designed to help understand the relationship between people management practices and retail performance. We report on the current development of our simulation models, which includes new features concerning the evolution of customers over time. To test these features we have conducted a series of experiments dealing with customer pool sizes, standard and noise-reduction modes, and the spread of customers’ word of mouth. To validate and evaluate our model, we introduce a new performance measure specific to retail operations. We show that by varying different parameters in our model we can simulate a range of customer experiences leading to significant differences in performance measures. Ultimately, we are interested in better understanding the impact of changes in staff behavior due to changes in store management practices. Our multi-disciplinary research team draws upon expertise from work psychologists and computer scientists. Despite the fact that we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for fostering sustainable organizational capabilities in the future.
Abstract:
The increased physical demands of the game of football have raised the fitness requirements placed on players and, by extension, on referees. This study therefore sought to identify and develop a test for the assessment of football referees. A systematic review was carried out to identify and describe the scientific literature on refereeing, in order to substantiate the argument that the current tests are insufficient and to propose a new test, which we named ETSOR. A pilot application using a case-study method was then carried out to trial ETSOR. The results revealed a dispersion in the forms and contents addressed in characterizing the 11-a-side football referee. Based on a meta-analysis, a proposal for categorizing those contents is presented. The results also showed that the FIFA test neither identifies the irregular intensities that arise from game situations nor represents the distribution of referees' effort intensities in those situations. The ETSOR test, as an ecological test, captures the density, distribution, power variation and endurance of referees' efforts in game situations, and tests the characteristics of the maximal intensities of the referee's activity. Finally, the results reinforced that this process should be extended, in periodized fashion, throughout each season, making it useful insofar as it enables the optimization and monitoring of the referee's performance.
Abstract:
Production companies use raw materials to compose end-products, and often make different products from the same raw materials. This research focuses on producing two end-products consisting of (partly) the same raw materials as cheaply as possible. Each product has its own demand and quality requirements, the latter consisting of quadratic constraints. Minimizing the costs under these quadratic constraints is a global optimization problem, which can be difficult because of possible local optima. Therefore, the multimodal character of the (bi-)blend problem is investigated. Standard optimization packages (solvers) in Matlab and GAMS were tested on their ability to solve the problem. In total, 20 test cases were generated or taken from the literature to test the solvers' effectiveness and efficiency. The research also gives insight into adjusting the quadratic constraints of the problem in order to arrive at a robust formulation of the bi-blend problem.
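A toy single-blend version of this problem class can make the structure concrete: minimize a linear raw-material cost subject to a quadratic quality constraint and the fractions summing to one. All prices and matrices below are made up, and a generic SLSQP solver with random restarts stands in for the dedicated solvers tested in the study; the multistart loop is there precisely because the quadratic constraint can create local optima.

```python
import numpy as np
from scipy.optimize import minimize

# Toy blend problem (all data made up): choose raw-material fractions x
# minimizing cost c.x subject to a quadratic quality requirement and the
# fractions summing to one. Several random starts guard against local optima.
rng = np.random.default_rng(2)
c = np.array([1.0, 2.0, 3.0])                  # raw-material prices
Q = np.array([[ 2.0, -1.0,  0.0],              # quadratic quality matrix
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
cons = [{'type': 'eq',   'fun': lambda x: x.sum() - 1.0},
        {'type': 'ineq', 'fun': lambda x: 0.5 - x @ Q @ x}]   # quality bound

best = None
for _ in range(10):                            # multistart
    x0 = rng.dirichlet(np.ones(3))             # random point on the simplex
    res = minimize(lambda x: c @ x, x0, bounds=[(0.0, 1.0)] * 3,
                   constraints=cons)
    if res.success and (best is None or res.fun < best.fun):
        best = res
```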
Abstract:
Nowadays, photovoltaic (PV) technology is consolidated as a source of renewable energy, and maximizing the energy efficiency of PV plants is a major research challenge. The main requirement for this purpose is to know, in real time, the performance of each of the PV modules that make up the PV field. To this end, a PLC-communications-based Smart Monitoring and Communications Module, able to monitor the operating parameters of each PV module, has been developed at the University of Malaga. With this device one can check whether any of the panels is underperforming due to a malfunction or partial shading of its surface. Since such fluctuations in the electricity production of a single panel affect the overall output of the string it belongs to, it is necessary to isolate the problem and reroute the energy through alternative paths in an array configuration of PV panels.
Abstract:
Purpose: To develop and optimise variables that influence the formulation of fluoxetine orally disintegrating tablets (ODTs). Methods: Fluoxetine ODTs were prepared using the direct compression method. A three-factor, three-level Box-Behnken design was used to develop and optimize the fluoxetine ODT formulation. The design suggested 15 formulations of different lubricant concentration (X1), lubricant mixing time (X2), and compression force (X3), whose effects were monitored on tablet weight (Y1), thickness (Y2), hardness (Y3), % friability (Y4), and disintegration time (Y5). Results: All powder blends showed acceptable flow properties, ranging from good to excellent. The disintegration time (Y5) was affected directly by lubricant concentration (X1). Lubricant mixing time (X2) had a direct effect on tablet thickness (Y2) and hardness (Y3), while compression force (X3) had a direct impact on tablet hardness (Y3), % friability (Y4) and disintegration time (Y5). Accordingly, the Box-Behnken design suggested an optimized formula of 0.86 mg (X1), 15.3 min (X2), and 10.6 kN (X3). The prediction error percentages of responses Y1, Y2, Y3, Y4, and Y5 were 0.31, 0.52, 2.13, 3.92 and 3.75 %, respectively. Formulae 4 and 8 achieved 90 % drug release within the first 5 min of the dissolution test. Conclusion: A fluoxetine ODT formulation has been developed and optimized successfully using a Box-Behnken design and manufactured efficiently using the direct compression technique.
Abstract:
The financial and economic crisis which originated in 2008 has had a severe impact on the population of the Southern European countries. The economic policies of austerity and public deficit control, as well as neo-liberal and conservative social policies, are redefining the public social protection systems, in particular the Social Services. In order to understand the current situation, we explain how the Social Services were developed in Spain and analyse the causes and consequences of the economic crisis. The working hypothesis is that the greater the increase in the population's needs, the more developed the Social Services should be. We carried out a descriptive analysis of the social impacts of the crisis per region. We tested the hypothesis through a parametric analysis of variance (one-way ANOVA), triangulating with the non-parametric Kruskal-Wallis test. The working hypothesis failed: the regions with better developed Social Services show a lower level of poverty and social exclusion. The challenges that the public Social Services system faces in times of crisis are three-fold: 1) re-modelling of local administration and transfer of municipal Social Services responsibilities to the regional administration; 2) an increase in the population at risk of poverty and social exclusion; 3) impact on social policies.
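The parametric/non-parametric triangulation described above can be sketched as follows. The three groups of regions and all numbers are synthetic stand-ins for the regional indicators used in the study.

```python
import numpy as np
from scipy import stats

# Sketch of the triangulation: a one-way ANOVA on a region-level indicator
# (say, a poverty rate) across groups of regions, cross-checked with the
# non-parametric Kruskal-Wallis test. All data below is synthetic.
rng = np.random.default_rng(3)
low_dev  = rng.normal(25, 3, size=15)   # regions with less developed Social Services
mid_dev  = rng.normal(22, 3, size=15)
high_dev = rng.normal(18, 3, size=15)   # regions with more developed Social Services

f_stat, p_anova = stats.f_oneway(low_dev, mid_dev, high_dev)   # parametric
h_stat, p_kw    = stats.kruskal(low_dev, mid_dev, high_dev)    # non-parametric
```

Agreement between the two p-values is the point of the triangulation: the Kruskal-Wallis test does not rely on the normality assumption behind the ANOVA.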
Abstract:
Efficient hill climbers have recently been proposed for single- and multi-objective pseudo-Boolean optimization problems. For $k$-bounded pseudo-Boolean functions where each variable appears in at most a constant number of subfunctions, it has been theoretically proven that the neighborhood of a solution can be explored in constant time. These hill climbers, combined with a high-level exploration strategy, have been shown to improve on state-of-the-art methods in experimental studies, and they open the door to so-called Gray Box Optimization, in which part, but not all, of the details of the objective functions are used to better explore the search space. One important limitation of all previous proposals is that they can only be applied to unconstrained pseudo-Boolean optimization problems. In this work, we address the constrained case for multi-objective $k$-bounded pseudo-Boolean optimization problems. We find that adding constraints to the pseudo-Boolean problem has a linear computational cost in the hill climber.
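The "score vector" idea underlying these hill climbers can be sketched on a random unconstrained $k$-bounded function (instance data below is made up): $f(x)$ is a sum of subfunctions, each reading at most $k$ variables, and the score of bit $i$ is $f$ with bit $i$ flipped minus $f$. After a flip, only the scores of bits sharing a subfunction with the flipped bit change, which is what makes constant-time neighborhood exploration possible; for clarity this sketch recomputes the scores from scratch rather than updating them incrementally.

```python
import numpy as np

# Random k-bounded pseudo-Boolean instance: each subfunction reads k
# variables (its mask) through a lookup table over their 2^k settings.
rng = np.random.default_rng(1)
n, k = 10, 2
masks  = [rng.choice(n, size=k, replace=False) for _ in range(n)]
tables = [rng.random(2 ** k) for _ in range(n)]

def sub_value(j, x):
    idx = int("".join(str(x[v]) for v in masks[j]), 2)
    return tables[j][idx]

def f(x):
    return sum(sub_value(j, x) for j in range(len(masks)))

def scores(x):
    # score of bit i = f(x with bit i flipped) - f(x); only subfunctions
    # whose mask contains i can contribute to that difference
    s = np.zeros(n)
    for i in range(n):
        y = x.copy(); y[i] ^= 1
        for j in range(len(masks)):
            if i in masks[j]:
                s[i] += sub_value(j, y) - sub_value(j, x)
    return s

x = rng.integers(0, 2, n)
s = scores(x)                  # improvement of every single-bit flip
best = int(np.argmax(s))       # best move; improving whenever s[best] > 0
```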
Abstract:
Several deterministic and probabilistic methods are used to evaluate the probability of seismically induced liquefaction of a soil. Probabilistic models usually carry uncertainty in the model itself and in the parameters used to develop it, and these model uncertainties vary from one statistical model to another. Most of them are epistemic and can be addressed through appropriate knowledge of the statistical model. One such epistemic uncertainty in evaluating liquefaction potential with a probabilistic model such as logistic regression is sampling bias: the difference between the class distribution in the sample used for developing the statistical model and the true population distribution of liquefaction and non-liquefaction instances. Recent studies have shown that sampling bias can significantly affect the probability predicted by a statistical model. To address this epistemic uncertainty, a new approach was developed for evaluating the probability of seismically induced soil liquefaction, in which a logistic regression model was used in combination with the Hosmer-Lemeshow statistic. This approach was used to estimate the population (true) distribution of liquefaction to non-liquefaction instances in the most up-to-date case histories based on the standard penetration test (SPT) and the cone penetration test (CPT). In addition, other model uncertainties, such as the distribution and significance of the explanatory variables, were addressed using the KS test and the Wald statistic, respectively. Moreover, based on the estimated population distribution, logistic regression equations were proposed to calculate the probability of liquefaction for both SPT- and CPT-based case histories. Additionally, the proposed probability curves were compared with existing probability curves based on SPT and CPT case histories.
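The sampling-bias effect can be illustrated with a small synthetic example: a logistic model fitted on a sample with roughly 50% liquefaction cases overstates the probability when the true population rate is lower, and the standard prior-correction adjusts only the intercept. This sketch is not the paper's Hosmer-Lemeshow-based procedure; the data is synthetic, the assumed population share tau is made up, and a plain gradient-ascent fit stands in for a statistics package.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                            # stand-in explanatory variable
y = (x + rng.normal(size=n) > 0).astype(float)    # ~50/50 "biased" sample

# fit P(y=1|x) = sigmoid(b0 + b1*x) by plain gradient ascent on the log-likelihood
b0, b1 = 0.0, 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(b0 + b1 * x)))
    b0 += 0.1 * np.mean(y - p)
    b1 += 0.1 * np.mean((y - p) * x)

ybar = y.mean()   # liquefaction share in the sample
tau = 0.2         # assumed true population share (hypothetical)

# prior correction of the intercept for the sample/population mismatch
b0_corr = b0 - np.log((1 - tau) / tau * ybar / (1 - ybar))

def prob(xv, intercept):
    return 1 / (1 + np.exp(-(intercept + b1 * xv)))
```

Since tau < ybar here, the corrected intercept lowers every predicted probability, which is the direction of the bias the abstract describes.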
Abstract:
The use of intriguing open-ended quick-write prompts within the Basotho science classroom could potentially give secondary teachers in Lesotho a time-efficient alternative for stimulating student thinking and increasing critical thinking and the application of scientific principles. Writing can be a powerful means to improve student achievement across many subject areas, including the sciences (Moore, 1993; Rivard, 1994; Rillero, Zambo, Cleland, and Ryan, 1996; Greenstein, 2013). This study focuses on a writing method that is neither traditional nor extensively studied but could potentially support learning in science. A quasi-experimental research design, with a control and an experimental group, was applied. The study was conducted at two schools over a period of 4 weeks, with the experimental classroom in one school and the control classroom in the other. 51 Form B (US Grade 9 equivalent) students participated as the experimental group and 43 Form B students as the control group. To assess learning achievement, a 1-hour (35-mark) pre-test was prepared and administered by Basotho teachers at the beginning of the study to gauge students' prior knowledge. Topics covered were Static Electricity, Current Electricity, Electromagnetic Waves, and Chemistry of Water. After the experimental trial period, an almost identical post-test was given to students in the same fashion to observe and compare gains in achievement. Test data were analyzed using inferential statistics comparing the means and knowledge gains of the experimental and control groups. The differences between mean pre-test and post-test scores were statistically significant within each group, but the gains were not statistically significant when the control and experimental groups were compared; therefore, there was no clear practical effect.
Qualitative data from teachers' journals and students' written feedback provide insight into the assessments, the incorporation of the teaching method, and the development of participating students. Both mid- and post-study student feedback shows that students had an overall positive and beneficial experience participating in this activity. Assessments and teacher journals showed areas of strength and weakness in student learning and differences in teaching styles, and helped support some of the claims made in student feedback. Areas of further research and improvements to the incorporation of this teaching method in the Basotho secondary science classroom are explored.
Abstract:
Tall buildings are wind-sensitive structures and can experience high wind-induced effects. Aerodynamic boundary-layer wind tunnel testing has been the most commonly used method for estimating wind effects on tall buildings: design wind effects are estimated through analytical processing of the data obtained from aerodynamic wind tunnel tests. Even though it is widely agreed that the data obtained from wind tunnel testing is fairly reliable, the post-test analytical procedures are still argued to carry remarkable uncertainties. This research work assessed in detail the uncertainties occurring at different stages of the post-test analytical procedures and suggests improved techniques for reducing them. Results of the study showed that traditionally used simplifying approximations, particularly in the frequency-domain approach, can cause significant uncertainties in estimating aerodynamic wind-induced responses. Based on the identified shortcomings, a more accurate dual aerodynamic data analysis framework working in both the frequency and time domains was developed. This comprehensive framework allows more accurate estimation of modal, resultant and peak values of various wind-induced responses of a tall building. Estimating design wind effects on tall buildings also requires synthesizing the wind tunnel data with local climatological data of the study site. After investigating the causes of significant uncertainties in currently used synthesizing techniques, a novel copula-based approach was developed for accurately synthesizing aerodynamic and climatological data. The improvement of the new approach over existing techniques was illustrated with a case study on a 50-story building. Finally, a practical dynamic optimization approach was suggested for tuning the structural properties of tall buildings towards optimum performance against wind loads with fewer design iterations.
Abstract:
This study aims to characterize the users of the National Long-Term Care Network (NL-TCN). The Portuguese National Health Service was restructured in 2006 with the creation of the National Long-Term Care Network, to respond to new health and social needs concerning continuity of care. Objectives: to analyse the sociodemographic profile of the network users and to review hospital, local and regional management procedures. Methods: observational methods were used; data were processed and results presented with the Statistical Package for the Social Sciences, version 20, using descriptive statistics (frequencies, crosstabs and the chi-square test). Results: of a sample of 805 cases, 595 (74%) were admitted to the NL-TCN, a rate lower than the national average (86%). Almost half of the sample was admitted to Rehabilitation Units (46%), whereas nationally the highest number of admissions was to Home Care Teams (30%). The average time from hospital referral to network admission was 9.73 days, and the Pearson correlation test showed a positive correlation between the duration of local and regional management procedures and the hospital's length of stay. For specialized units, the maximum waiting times were for the Long-Term and Support Units (mean 30.27 days) and the minimum waiting times were for Home Care Teams (mean 5.57 days); the average time between local and regional management was 3.59 days. Almost 90% of referrals came from orthopaedics, internal medicine and neurology, and network users were mostly elderly (average 75 years old), female and married. Most users were admitted to inpatient units (78%) and only 15% remained in their home town.