941 results for test case optimization
Abstract:
This paper addresses the current discussion on links between party politics and production regimes. Why do German Social Democrats opt for more corporate governance liberalization than the CDU although, in terms of the distributional outcomes of such reforms, one would expect the situation to be reversed? I divide my analysis into three stages. First, I use the European Parliament’s crucial vote on the European takeover directive in July 2001 as a test case to show that the left-right dimension does indeed matter in corporate governance reform, alongside cross-class and cross-party nation-based interests. In a second step, by analyzing party positions in the main German corporate governance reforms of the 1990s, I show that the SPD and the CDU behaved “paradoxically” in the sense that the SPD favored more corporate governance liberalization than the CDU, which protected the institutions of “Rhenish,” “organized” capitalism. This constellation occurred in the discussions on company disclosure, management accountability, the power of banks, network dissolution, and takeover regulation. Third, I offer two explanations for this paradoxical party behavior. The first concerns the historical conversion of ideas: trade unions and Social Democrats favored a high degree of capital organization in the Weimar Republic, but this ideological position was driven in new directions at two watersheds, one in the late 1940s and the other in the late 1950s. The second lies in the importance of conflicts over managerial control, in which both employees and minority shareholders oppose managers, and in which increased shareholder power strengthens the position of works councils.
Abstract:
When the new European Commission started work in autumn 2014, the president of the Commission took great pride in calling it a ‘political Commission’, one that would be big on big things and small on small. Whilst the EU is currently dealing with many crises, the reality is that things do not come much bigger than Nord Stream II. Will this be a political Commission that stands by its principles, including respect for liberty, democracy, the rule of law and human rights? Will this Commission have the backbone to politically assess a project that threatens EU unity and its core values, undermines the Union’s commonly agreed commitment to building an Energy Union and facilitates Russia’s aggression against Ukraine? President Juncker’s controversial visit to Russia and meeting with President Putin on 16-17 June is a test case: will this Commission be ready to defend its commitments and principles when discussing ‘economic issues’?
Abstract:
A mathematical model for long-term, three-dimensional shoreline evolution is developed. The model accounts for the combined effects of variations in sea level; wave refraction and diffraction; loss of sand by density currents during storms, by rip currents, and by wind; bluff erosion and berm accretion; man-made structures such as long groins or navigational structures; and beach nourishment. A computer program is developed with various subroutines that permit modification as the state of the art progresses. The program is applied to a test case at Holland Harbor, Michigan.
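For orientation only, a hedged sketch: long-term shoreline-change models of this family are commonly built around a sediment-continuity (one-line) relation, with the listed processes entering as transport gradients or as source/sink terms; the report's exact formulation may differ:

\[
\frac{\partial y}{\partial t} + \frac{1}{D}\,\frac{\partial Q}{\partial x} = q,
\]

where \(y(x,t)\) is the shoreline position, \(Q\) the longshore sand transport rate (driven by the refracted and diffracted wave field), \(D\) the active profile depth (berm height plus depth of closure), and \(q\) a source/sink term collecting contributions such as beach nourishment, bluff erosion, and losses to rip currents, density currents and wind.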
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-03
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The testing of concurrent software components can be difficult due to the inherent non-determinism present in these components. For example, if the same test case is run multiple times, it may produce different results. This non-determinism may lead to problems with determining expected outputs. In this paper, we present and discuss several possible solutions to this problem in the context of testing concurrent Java components using the ConAn testing tool. We then present a recent extension to the tool that provides a general solution to this problem that is sufficient to deal with the level of non-determinism that we have encountered in testing over 20 components with ConAn.
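As a concrete illustration of the problem (a minimal sketch; the class name is hypothetical and nothing here uses ConAn's API), the following Java test body can print a different total on each run because thread interleavings vary, so a single fixed expected output cannot be asserted:

```java
// Minimal sketch of test non-determinism: two threads race on a shared counter.
public class NonDeterministicCounterTest {
    static int counter = 0; // shared and deliberately unsynchronized

    static void incrementMany() {
        for (int i = 0; i < 100_000; i++) {
            counter++; // unsynchronized read-modify-write: updates can be lost
        }
    }

    public static void main(String[] args) throws InterruptedException {
        for (int run = 1; run <= 5; run++) {
            counter = 0;
            Thread t1 = new Thread(NonDeterministicCounterTest::incrementMany);
            Thread t2 = new Thread(NonDeterministicCounterTest::incrementMany);
            t1.start();
            t2.start();
            t1.join();
            t2.join();
            // A fixed expected output (200000) cannot be asserted reliably:
            // each run may print a different total.
            System.out.println("run " + run + ": counter = " + counter);
        }
    }
}
```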
Abstract:
This thesis concerns mixed flows (which are characterized by the simultaneous occurrence of free-surface and pressurized flow in sewers, tunnels, culverts or under bridges), and contributes to the improvement of existing numerical tools for modelling these phenomena. The classic Preissmann slot approach is selected due to its simplicity and its capability of predicting results comparable to those of a more recent and complex two-equation model, as shown here with reference to a laboratory test case. To enhance computational efficiency, a local time stepping strategy is implemented in a shock-capturing Godunov-type finite volume numerical scheme for the integration of the de Saint-Venant equations. The results of different numerical tests show that local time stepping reduces run time significantly compared to conventional global time stepping (between −29% and −85% CPU time for the test cases considered), especially when only a small region of the flow field is surcharged, while solution accuracy and mass conservation are not impaired. The second part of the thesis is devoted to modelling the hydraulic effects of potentially pressurized structures, such as bridges and culverts, inserted in open channel domains. To this end, a two-dimensional mixed flow model is developed first: the classic conservative formulation of the 2D shallow water equations for free-surface flow is adapted by assuming that two fictitious vertical slots, intersecting at right angles, are added on the ceiling of each integration element. Numerical results show that this schematization is suitable for predicting 2D flooding phenomena in which the pressurization of crossing structures can be expected. Given that the Preissmann model does not allow for the possibility of bridge overtopping, a one-dimensional model is also presented to handle this particular condition: the flows below and above the deck are treated as parallel, and linked to the upstream and downstream reaches of the channel through suitable internal boundary conditions. Comparison with experimental data and with the results of HEC-RAS simulations shows that the proposed model can be a useful and effective tool for predicting overtopping and backwater effects induced by the presence of bridges and culverts.
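For reference, the 1D de Saint-Venant equations that such Godunov-type schemes integrate can be written in conservative form, with the Preissmann slot sized to reproduce a prescribed pressure-wave celerity \(a\) once the conduit is full (a textbook formulation; the thesis' exact discretization and closures may differ):

\[
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial Q}{\partial t} + \frac{\partial}{\partial x}\!\left(\frac{Q^2}{A} + g\,I_1\right) = g\,A\,(S_0 - S_f),
\]

where \(A\) is the wetted cross-sectional area, \(Q\) the discharge, \(I_1\) the hydrostatic pressure-force integral, \(S_0\) the bed slope and \(S_f\) the friction slope. Choosing the slot width

\[
T_s = \frac{g\,A_p}{a^2},
\]

with \(A_p\) the full-pipe area, makes the gravity-wave celerity \(\sqrt{g A / T}\) in the slot equal to \(a\), so pressurized flow can be handled by the same free-surface equations.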
Abstract:
The global market has become increasingly dynamic, unpredictable and customer-driven. This has led to rising rates of new product introduction and turbulent demand patterns across product mixes. As a result, manufacturing enterprises face mounting challenges to be agile and responsive enough to cope with market changes, so as to remain competitive in producing and delivering products to the market in a timely and cost-effective way. This paper introduces a currency-based iterative agent bidding mechanism that integrates the activities associated with production planning and control, so as to achieve an optimised process plan and schedule. The aim is to enhance the agility of manufacturing systems in accommodating dynamic changes in the market and in production. The iterative bidding mechanism is executed using currency-like metrics: each operation to be performed is assigned a virtual currency value, and agents bid for the operation if they can make a virtual profit against this value. The currency values are optimised iteratively, and the bidding is repeated with each new set of values, with the aim of obtaining progressively better production plans that approach near-optimality. A genetic algorithm is proposed to optimise the currency values at each iteration. The implementation of the mechanism and test case simulation results are also discussed.
Abstract:
In today's market, global competition has put manufacturing businesses under great pressure to respond rapidly to dynamic variations in demand patterns across products and changing product mixes. To achieve substantial responsiveness, the manufacturing activities associated with production planning and control must be integrated dynamically, efficiently and cost-effectively. This paper presents an iterative agent bidding mechanism that performs dynamic integration of process planning and production scheduling to generate optimised process plans and schedules in response to dynamic changes in the market and production environment. The iterative bidding procedure is carried out using currency-like metrics: all operations (e.g. machining processes) to be performed are assigned virtual currency values, and resource agents bid for the operations if the costs incurred in performing them are lower than the currency values. The currency values are adjusted iteratively, and resource agents re-bid for the operations based on the new set of currency values, until the total production cost is minimised. A simulated annealing optimisation technique is employed to optimise the currency values iteratively. The feasibility of the proposed methodology has been validated on a test case, and the results obtained show the method outperforming non-agent-based methods.
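This abstract and the previous one describe essentially the same currency-based bidding loop, optimised by a genetic algorithm in the first paper and by simulated annealing in the second. A minimal runnable sketch of the loop with simulated annealing follows; the cost matrix, the penalty for unassigned operations, the cooling schedule and all names are hypothetical illustrations, not the papers' implementation:

```java
import java.util.Random;

// Sketch: operations carry virtual currency values; resource agents bid for an
// operation only if their cost is below its value; simulated annealing adjusts
// the currency values to drive down the total production cost.
public class IterativeBiddingSketch {
    static final Random RNG = new Random(42);
    // COST[agent][operation]: cost for each resource agent to perform an operation.
    static final double[][] COST = {
        {4.0, 7.5, 3.0},
        {5.5, 6.0, 2.5},
        {4.5, 8.0, 4.0},
    };

    // Run one bidding round for a given set of currency values.
    static double totalCost(double[] currency) {
        double total = 0;
        for (int op = 0; op < currency.length; op++) {
            double best = Double.POSITIVE_INFINITY;
            for (double[] agent : COST)
                if (agent[op] < currency[op])      // virtual profit -> agent bids
                    best = Math.min(best, agent[op]); // cheapest bidder wins
            // Unassigned operations are penalized so annealing raises their value.
            total += (best == Double.POSITIVE_INFINITY) ? 100.0 : best;
        }
        return total;
    }

    public static void main(String[] args) {
        double[] currency = {6.0, 6.0, 6.0};
        double cost = totalCost(currency);
        for (double temp = 5.0; temp > 0.01; temp *= 0.95) {      // cooling schedule
            double[] cand = currency.clone();
            cand[RNG.nextInt(cand.length)] += RNG.nextGaussian(); // perturb one value
            double candCost = totalCost(cand);
            // Metropolis acceptance: take improvements, sometimes accept worse moves.
            if (candCost < cost || RNG.nextDouble() < Math.exp((cost - candCost) / temp)) {
                currency = cand;
                cost = candCost;
            }
        }
        System.out.printf("best total production cost: %.2f%n", cost);
    }
}
```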
Abstract:
One of the main challenges of classifying clinical data is determining how to handle missing features. Most research favours imputing missing values or discarding records that include missing data, both of which can degrade accuracy when missing values exceed a certain level. In this research we propose a methodology for handling data sets with a large percentage of missing values and high variability in which particular data are missing. Feature selection is performed by picking variables sequentially in order of maximum correlation with the dependent variable and minimum correlation with the variables already selected. Classification models are generated individually for each test case, based on its particular feature set and the matching data values available in the training population. The method was applied to anonymized mental-health data from real patients, where the task was to predict the suicide risk judgement clinicians would give for each patient's data, with eleven possible outcome classes, zero to ten, representing no risk to maximum risk. The results compare favourably with alternative methods and have the advantage of ensuring that explanations of risk are based only on the data given, not on imputed data. This is important for clinical decision support systems that use human expertise for modelling and for explaining predictions.
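A minimal sketch of the stated selection rule (relevance to the target minus redundancy with already-selected features); the unit redundancy weight and the toy data are illustrative choices, and the per-test-case model generation step is not shown:

```java
import java.util.ArrayList;
import java.util.List;

// Greedy selection: maximize |corr(feature, target)| while penalizing
// |corr(feature, already-selected features)|.
public class GreedyCorrelationSelection {
    // Pearson correlation between two equal-length series.
    static double pearson(double[] a, double[] b) {
        int n = a.length;
        double ma = 0, mb = 0;
        for (int i = 0; i < n; i++) { ma += a[i]; mb += b[i]; }
        ma /= n; mb /= n;
        double cov = 0, va = 0, vb = 0;
        for (int i = 0; i < n; i++) {
            cov += (a[i] - ma) * (b[i] - mb);
            va  += (a[i] - ma) * (a[i] - ma);
            vb  += (b[i] - mb) * (b[i] - mb);
        }
        return cov / Math.sqrt(va * vb);
    }

    /** features[j] holds the values of feature j; y is the dependent variable. */
    static List<Integer> select(double[][] features, double[] y, int k) {
        List<Integer> chosen = new ArrayList<>();
        while (chosen.size() < k) {
            int bestJ = -1;
            double bestScore = Double.NEGATIVE_INFINITY;
            for (int j = 0; j < features.length; j++) {
                if (chosen.contains(j)) continue;
                // Relevance to the target ...
                double score = Math.abs(pearson(features[j], y));
                // ... minus redundancy with features already selected.
                for (int c : chosen) score -= Math.abs(pearson(features[j], features[c]));
                if (score > bestScore) { bestScore = score; bestJ = j; }
            }
            chosen.add(bestJ);
        }
        return chosen;
    }

    public static void main(String[] args) {
        double[][] x = { {1, 2, 3, 4, 5}, {2, 4, 6, 8, 10}, {5, 3, 4, 1, 2} };
        double[] y = {1.1, 2.0, 2.9, 4.2, 5.1};
        System.out.println("selected features: " + select(x, y, 2));
    }
}
```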
Abstract:
E-learning represents an innovation in teaching that arises from the development of new technologies. It is based on a set of educational resources including, among others, multimedia and interactive content accessible through Internet or intranet networks. A whole spectrum of tools and services supports e-learning; some of them include self-assessment and the automated correction of test-like exercises. However, exercises of this sort are heavily constrained by their nature: fixed content and fixed correct answers limit the ways in which teachers can evaluate students. In this paper we propose a new engine for validating complex exercises in the area of Data Structures and Algorithms. Correct solutions do not depend only on how well the code executes or on whether the results match those expected; criteria on algorithmic complexity and on correct use of the data structures are also required. The engine presented in this work covers a wide set of exercises with these characteristics, allowing teachers to establish the requirements for a solution and students to obtain a measure of the quality of their solution in the same terms that are later required in exams.
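One illustrative way to operationalize an algorithm-complexity criterion of the kind such an engine checks (this is not the paper's engine): count the comparisons a submission performs through an instrumented Comparator and test the count against an O(n log n) budget. Here Arrays.sort stands in for the student's code, and the budget constant is an arbitrary assumption:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.Random;
import java.util.concurrent.atomic.AtomicLong;

// Sketch of two grading criteria: correctness of the result, and an empirical
// bound on the number of comparisons the submitted algorithm performs.
public class ComplexityCriterionSketch {
    public static void main(String[] args) {
        int n = 10_000;
        Integer[] data = new Random(7).ints(n).boxed().toArray(Integer[]::new);

        AtomicLong comparisons = new AtomicLong();
        Comparator<Integer> counting = (a, b) -> {
            comparisons.incrementAndGet();          // instrument every comparison
            return Integer.compare(a, b);
        };

        Integer[] result = data.clone();
        Arrays.sort(result, counting);              // stand-in for the student's code
        long used = comparisons.get();

        // Criterion 1: the output must be correctly sorted.
        boolean sorted = true;
        for (int i = 1; i < n; i++)
            if (Integer.compare(result[i - 1], result[i]) > 0) sorted = false;

        // Criterion 2: comparisons must stay within a c * n * log2(n) budget.
        double budget = 3.0 * n * (Math.log(n) / Math.log(2));
        System.out.println("sorted correctly: " + sorted);
        System.out.println("comparisons: " + used
                + " within budget " + (long) budget + ": " + (used <= budget));
    }
}
```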
Abstract:
In this dissertation, I offer a pedagogical proposal for learning the Christian Scriptures guided by respect for the nature of the reader and the integrity of the biblical text. Christian educators have profitably applied recent theoretical interest in the body’s role in human meaning to worship and praxis methodologies, but the implications of this research for communal study of the biblical text merit further development. I make the case for adopting scriptural imagination as the goal of pedagogically constructed encounters with the Christian Scriptures. The argument proceeds through a series of questions addressing both sides of the text/reader encounter.
Chapter one considers the question “what is the nature of the reader and, subsequently, the shape of the reader’s ways of knowing?” This investigation into recent literature on the body’s involvement in human knowing includes related epistemological shifts within Christian education. On the basis of this survey, imagination emerges as a compelling designator of an incorporative, constructive creaturely capacity that gives rise to a way of being in the world. Teachers of Scripture who intend to participate in Christian formation should account for the imagination’s centrality to all knowing. After briefly situating this proposal within a theological account of creatureliness, I make the initial case for scriptural imagination as a pedagogical aim.
Imagination as creaturely capacity addresses the first guiding value, but does this proposal also respect the integrity and nature of the biblical text, and specifically of biblical narratives? In response, in chapter two I take up the Acts of the Apostles as a potential test case and exemplar for the dynamics pertinent to the formation of imagination. Drawing on secondary literature on the genre and literary features of Acts, I conclude that Acts coheres with this project’s explicit interest in imagination as a central component of the process of Christian formation in relationship to the Scriptures.
Chapters three and four each take up a pericope from Acts to assess whether the theoretical perspectives developed in prior chapters generate any interpretive payoff. In each of these chapters, a particular story within Acts functions as a test case for readings of biblical narratives guided by a concern for scriptural imagination. Each of these chapters begins with further theoretical development of some element of imaginal formation. Chapter three provides a theoretical account of practices as they relate to imagination, bringing that theory into conversation with Peter’s engagement in hospitality practices with Cornelius in Acts 10:1-11:18. Chapter four discusses the formative power of narratives, with implications for the analysis of Paul’s shipwreck in Acts 27:1-28:16.
In the final chapter, I offer a two-part constructive pedagogical proposal for reading scriptural narratives in Christian communities. First, I suggest adopting resonance above relevance as the goal of pedagogically constructed encounters with the Scriptures. Second, I offer three ways of reading with the body, including the physical, ecclesial, and social bodies that shape all learning. I conclude by identifying the importance of scriptural imagination for Christian formation and witness in the twenty-first century.
Abstract:
This study examines the impact of ambient temperature on emotional well-being in the U.S. population aged 18+. The U.S. is an interesting test case because of its resources, technology and variation in climate across different areas, which also allows us to examine whether adaptation to different climates could weaken or even eliminate the impact of heat on well-being. Using survey responses from 1.9 million Americans over the period from 2008 to 2013, we estimate the effect of temperature on well-being from exogenous day-to-day temperature variation within respondents’ area of residence and test whether this effect varies across areas with different climates. We find that increasing temperatures significantly reduce well-being. Compared to average daily temperatures in the 50–60 °F (10–16 °C) range, temperatures above 70 °F (21 °C) reduce positive emotions (e.g. joy, happiness), increase negative emotions (e.g. stress, anger), and increase fatigue (feeling tired, low energy). These effects are particularly strong among less educated and older Americans. However, there is no consistent evidence that heat effects on well-being differ across areas with mild and hot summers, suggesting limited variation in heat adaptation.
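One plausible reading of this estimation design, offered only as a sketch (the paper's exact specification, controls, and fixed effects may differ), is a binned-temperature fixed-effects regression with the 50–60 °F bin as the omitted reference category:

\[
WB_{iat} = \sum_{k \ne [50,60)} \beta_k \,\mathbf{1}\{T_{at} \in \text{bin}_k\} + \alpha_a + \gamma_t + X_{iat}'\delta + \varepsilon_{iat},
\]

where \(WB_{iat}\) is the reported emotion of respondent \(i\) in area \(a\) on day \(t\), \(\alpha_a\) are area fixed effects absorbing climate differences across places, \(\gamma_t\) are date effects, and each \(\beta_k\) measures the well-being difference of a day in temperature bin \(k\) relative to a 50–60 °F day; heat adaptation can then be probed by interacting the \(\beta_k\) with the local climate.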
Abstract:
Steady-state computational fluid dynamics (CFD) simulations are an essential tool in the design process of centrifugal compressors. Whilst global parameters, such as pressure ratio and efficiency, can be predicted with reasonable accuracy, the accurate prediction of detailed compressor flow fields is a much more significant challenge. Much of the inaccuracy is associated with the incorrect selection of turbulence model. The need for a quick turnaround in simulations during the design optimisation process also demands that the turbulence model selected be robust and numerically stable, with short simulation times.
In order to assess the accuracy of a number of turbulence model predictions, the current study used an exemplar open CFD test case, the centrifugal compressor ‘Radiver’, to compare the results of three eddy-viscosity models and two Reynolds stress type models. The turbulence models investigated were (i) the Spalart-Allmaras (SA) model, (ii) the Shear Stress Transport (SST) model, (iii) a modification of the SST model denoted the SST curvature correction (SST-CC) model, (iv) the Reynolds stress model of Speziale, Sarkar and Gatski (RSM-SSG), and (v) the turbulence-frequency-formulated Reynolds stress model (RSM-ω). Each was found to be in good agreement with the experiments (below 2% discrepancy) with respect to total-to-total parameters at three different operating conditions. However, at the off-design conditions, local flow field differences were observed between the models, with the SA model showing particularly poor prediction of local flow structures. The SST-CC model gave better predictions of the curved rotating flows in the impeller, the RSM-ω was better for the wake and separated flow in the diffuser, and the SST model showed reasonably stable, robust and time-efficient capability to predict both global and local flow features.
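For context on the distinction drawn here (standard turbulence-modelling background, not specific to this paper): the eddy-viscosity models (i)-(iii) close the Reynolds stresses through the Boussinesq hypothesis, whereas the Reynolds stress models (iv)-(v) solve transport equations for the individual stress components:

\[
-\rho\,\overline{u_i' u_j'} = 2\,\mu_t\,S_{ij} - \tfrac{2}{3}\,\rho\,k\,\delta_{ij},
\]

where \(\mu_t\) is the eddy viscosity, \(S_{ij}\) the mean strain-rate tensor and \(k\) the turbulent kinetic energy. The isotropy built into this closure is one reason eddy-viscosity models can struggle with the curved, rotating and separated flows noted above, which motivates corrections such as SST-CC.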
Abstract:
The aim of this thesis was to investigate, using the real-time test case of the 2014 Commonwealth Games, whether the realist synthesis methodology could contribute meaningfully to the making of health policy. This was done by addressing two distinct research questions: first, whether realist synthesis could contribute new insights to the health policymaking process, and second, whether the 2014 Commonwealth Games volunteer programme was likely to have any significant, measurable impact on the health inequalities experienced by large sections of the host population. The 2014 Commonwealth Games legacy set out ambitious plans for the event, anticipating that it would provide explicit opportunities to impact positively on health inequalities. By using realist synthesis to unpick the theories underpinning the volunteer programme, the review identifies the population subgroups for whom the programme was likely to be successful, how this could be achieved, and in what contexts. In answer to the first research question, the review found that while realist methods were able to provide a more nuanced exposition of the impacts of the Games volunteer programme on health inequalities than previous traditional reviews had provided, the method had several drawbacks: it was resource-intensive and complex, encouraging the exploration of a much wider set of literatures at the expense of an in-depth grasp of the complexities of those literatures. In answer to the second research question, the review found that the Games were, if anything, likely to exacerbate health inequalities, because the programme was designed in such a way that the individuals recruited to it were most likely to be those least in need of the additional mental and physical health benefits that Games volunteering was designed to provide. The thesis details the approach taken to investigate both the realist approach to evidence synthesis and the likelihood that the 2014 Games volunteer programme would yield the expected results.