5 results for Solving Problems for Evidence
in Digital Commons at Florida International University
Abstract:
This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model in an MTO setting, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is used to solve the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of the study focuses on developing an effective solution approach for large-scale instances of this problem. The proposed approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most profit rule performs best. The shifting bottleneck and earliest operation finish time rules are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio.
The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show that it improves total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for problems of industrial scale.
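The abstract names the Wagner-Whitin algorithm as the basis of the proposed minimum cost lot-sizing heuristic. As background, here is a minimal sketch of the classic Wagner-Whitin dynamic program for single-item uncapacitated lot sizing; the function name and the demand and cost figures are illustrative, not the dissertation's own instance.

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Minimum-cost lot sizes for periods 0..T-1.

    demand[t]    -- demand in period t
    setup_cost   -- fixed cost per production run
    holding_cost -- cost to carry one unit for one period
    Returns (total_cost, lot_sizes).
    """
    T = len(demand)
    INF = float("inf")
    best = [INF] * (T + 1)   # best[t] = min cost to satisfy periods 0..t-1
    best[0] = 0.0
    choice = [0] * (T + 1)   # choice[t] = period in which the last lot is produced
    for t in range(1, T + 1):
        for s in range(t):   # produce in period s to cover periods s..t-1
            carry = sum(holding_cost * (k - s) * demand[k] for k in range(s, t))
            cost = best[s] + setup_cost + carry
            if cost < best[t]:
                best[t], choice[t] = cost, s
    # Recover lot sizes by walking back through the chosen production periods.
    lots = [0] * T
    t = T
    while t > 0:
        s = choice[t]
        lots[s] = sum(demand[s:t])
        t = s
    return best[T], lots

cost, lots = wagner_whitin([20, 50, 10, 50, 50], setup_cost=100, holding_cost=1)
print(cost, lots)   # 320.0 [80, 0, 0, 100, 0]
```

The key property exploited by the recursion is that some optimal plan produces only in periods with zero entering inventory, so each period's demand is covered entirely by a single earlier production run.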
Abstract:
Satisfiability, implication, and equivalence problems are important and widely encountered database problems that must be solved efficiently and effectively. We provide a comprehensive and systematic study of these problems. We consider three popular types of arithmetic inequalities, (X op C), (X op Y), and (X op Y + C), where X and Y are attributes, C is a constant from the domain of X, and op ∈ {<, ≤, =, ≠, >, ≥}. These inequalities are the most frequently used in a database system: the first type represents selection, the second type represents θ-join, and the third type is popular in deductive databases. We study the problems under the integer domain and the real domain, as well as under two different operator sets. Our results show that solutions under different domains and/or different operator sets are quite different. In this dissertation, we either report the first necessary and sufficient conditions, together with efficient algorithms and complexity analysis, or provide improved algorithms.
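To make the satisfiability question concrete, here is a sketch for one well-known fragment of the third inequality type: conjunctions of constraints X ≤ Y + C over the reals. Each constraint becomes a weighted edge Y → X, and the system is satisfiable exactly when the constraint graph has no negative cycle, which Bellman-Ford-style relaxation detects. This covers only the ≤ operator; the dissertation's full operator sets and the integer domain require the more refined conditions it develops.

```python
def satisfiable(constraints):
    """constraints: list of (X, Y, C) triples, each meaning X <= Y + C."""
    nodes = {v for x, y, _ in constraints for v in (x, y)}
    dist = {v: 0.0 for v in nodes}           # virtual source at distance 0
    for _ in range(len(nodes)):
        changed = False
        for x, y, c in constraints:          # relax edge y -> x with weight c
            if dist[y] + c < dist[x]:
                dist[x] = dist[y] + c
                changed = True
        if not changed:
            return True                      # fixpoint reached: consistent
    return False                             # still relaxing: negative cycle

# x <= y + 1 and y <= x - 2 together force 0 <= -1: unsatisfiable.
print(satisfiable([("x", "y", 1), ("y", "x", -2)]))   # prints False
# x <= y + 1 and y <= x + 2 are jointly satisfiable (e.g., x = y = 0).
print(satisfiable([("x", "y", 1), ("y", "x", 2)]))    # prints True
```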
Abstract:
Ellipsometry is a well-known optical technique for characterizing reflective surfaces under study and films between two media. It is based on measuring the change in the state of polarization that occurs as a beam of polarized light is reflected from or transmitted through the film. This change can be used to calculate parameters of a single-layer film, such as its thickness and refractive index. However, extracting these parameters of interest requires significant numerical processing because the governing equations are not invertible. Typically, this is done with least-squares solving methods, which are slow and adversely affected by local minima in the solution surface. This thesis describes the development and implementation of a new technique that uses only Artificial Neural Networks (ANNs) to calculate thin-film parameters. The new method is orders of magnitude faster than preceding methods, and convergence to local minima is completely eliminated.
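The core idea can be illustrated in a toy form: rather than inverting a noninvertible forward model by iterative least squares, forward-simulate (parameter, measurement) pairs and train a network to learn the inverse map, so inversion becomes a single forward pass. The stand-in forward model `g` below is not the real ellipsometry equations, and the network sizes and training settings are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(d):
    # Stand-in forward model mapping a film parameter to a measurement
    # (monotone on [0, 2], so an inverse exists); NOT the real film equations.
    return np.sin(d) + 0.5 * d

# Forward-simulate training data: sample parameters, compute measurements.
d_train = rng.uniform(0.0, 2.0, size=(2000, 1))
m_train = g(d_train)

# One-hidden-layer network trained to learn the inverse map m -> d.
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    h = np.tanh(m_train @ W1 + b1)        # hidden layer
    pred = h @ W2 + b2                    # predicted parameter
    err = pred - d_train
    # Backpropagate the mean squared error.
    g2 = 2 * err / len(d_train)
    gW2 = h.T @ g2; gb2 = g2.sum(0)
    gh = (g2 @ W2.T) * (1 - h ** 2)
    gW1 = m_train.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Once trained, inversion is a single forward pass -- no iterative solver.
d_true = np.array([[0.7]])
d_hat = np.tanh(g(d_true) @ W1 + b1) @ W2 + b2
print(float(d_hat))                       # should be close to 0.7
```

This also shows where the speed claim comes from: all of the expensive work happens once, during training, and each subsequent measurement is inverted with a handful of matrix multiplications.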
Abstract:
This thesis extends previous research on critical decision making and problem solving by refining and validating a self-report measure designed to assess the use of critical decision making and problem solving in making life choices. Two studies were conducted, yielding two sets of data on the psychometric properties of the measure. Psychometric analyses included item analysis, internal consistency reliability, interrater reliability, and an exploratory factor analysis. The study also included a regression analysis with the Wonderlic, an established measure of general intelligence, to provide preliminary evidence for the construct validity of the measure.
Abstract:
Police investigators rely heavily on eliciting confessions from suspects to solve crimes and prosecute offenders. It is therefore essential to develop evidence-based interrogation techniques that motivate guilty suspects to confess while minimizing false confessions from the innocent. Currently, there is little scientific support for specific interrogation techniques that may increase true confessions and decrease false confessions. Rapport building is a promising possibility. Despite its recommendation in police interrogation guidelines, there is no scientific evidence on the effect of rapport building in police interrogations. The current study examined experimentally whether using rapport as an interrogation technique would influence participants' decisions to confess to a wrongdoing. It was hypothesized that building rapport with participants would lead to more true confessions and fewer false confessions than not building rapport. One hundred sixty-nine undergraduates participated in the study. Participants worked on logic problems, both together and individually, with a study confederate. The confederate asked half of the participants for help on one of the individual problems, effectively breaking the rules of the study. After the problems were completed, a research assistant playing the role of interviewer entered the room, either built rapport with participants or did not, accused all participants of cheating by sharing answers on the individual problems, and asked them to sign a statement admitting their guilt. Results indicated that guilty participants were more likely to sign the confession statement than innocent participants. However, there were no significant differences in participants' confession decisions based on the level of rapport they experienced. The results do not support the hypothesis that building rapport increases the likelihood of obtaining true confessions and decreases the likelihood of obtaining false confessions.
These findings suggest that, despite the overwhelming recommendation for the use of rapport with suspects, its actual implementation may not have a direct impact on the outcome of interrogations.