8 results for Penalty Clause

in Digital Commons at Florida International University


Relevance: 20.00%

Abstract:

This study investigated the proposition density, sentence and clause type usage, and non-finite verbal usage in two college textbooks. The teaching implications are presented.

Relevance: 10.00%

Abstract:

Death qualification is a part of voir dire that is unique to capital trials. Unlike all other litigation, capital jurors must affirm their willingness to impose either legal penalty (life in prison or the death penalty). Jurors who assert that they are able to do so are deemed “death-qualified” and are eligible for capital jury service; jurors who assert that they are unable to do so are deemed “excludable” or “scrupled” and are barred from hearing a death penalty case. During the penalty phase of capital trials, death-qualified jurors weigh the aggravators (i.e., arguments for death) against the mitigators (i.e., arguments for life) in order to determine the sentence. If the aggravating circumstances outweigh the mitigating circumstances, the jury is to recommend death; if the mitigating circumstances outweigh the aggravating circumstances, the jury is to recommend life. The jury is free to weigh each aggravating and mitigating circumstance in any manner it sees fit. Previous research has found that death qualification affects jurors' receptiveness to aggravating and mitigating circumstances (e.g., Luginbuhl & Middendorf, 1988). However, these studies used the now-defunct Witherspoon rule and did not include a case scenario for participants to reference. The purpose of this study was to investigate whether death qualification affects jurors' endorsements of aggravating and mitigating circumstances when Witt, rather than Witherspoon, is the legal standard for death qualification. Four hundred and fifty venirepersons from the 11th Judicial Circuit in Miami, Florida completed a booklet of stimulus materials that contained the following: two death qualification questions; a case scenario that included a summary of the guilt and penalty phases of a capital case; a 26-item measure that required participants to endorse aggravators, nonstatutory mitigators, and statutory mitigators on a 6-point Likert scale; and standard demographic questions.
Results indicated that death-qualified venirepersons, when compared to excludables, were more likely to endorse aggravating circumstances. Excludable participants, when compared to death-qualified venirepersons, were more likely to endorse nonstatutory mitigators. There was no significant difference between death-qualified and excludable venirepersons with respect to their endorsement of 6 of the 7 statutory mitigators. It would appear that the Furman v. Georgia (1972) decision declaring the death penalty unconstitutional is frustrated by the Lockhart v. McCree (1986) affirmation of death qualification.
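The weighing rule described above can be reduced to a simple decision function. The sketch below is purely illustrative: the abstract notes that jurors may weigh each circumstance in any manner they see fit, so the numeric weights here are hypothetical placeholders, not values from the study.

```python
# Hypothetical sketch of the penalty-phase weighing rule described above.
# Jurors assign their own weights to each circumstance; the numbers in
# the example are made up for illustration.

def recommend_sentence(aggravator_weights, mitigator_weights):
    """Return 'death' if the total aggravating weight exceeds the total
    mitigating weight, otherwise 'life'."""
    if sum(aggravator_weights) > sum(mitigator_weights):
        return "death"
    return "life"

# Example: two heavily weighted aggravators outweigh three mild mitigators.
print(recommend_sentence([3.0, 2.5], [1.0, 1.0, 1.5]))  # -> death
```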

Relevance: 10.00%

Abstract:

Polynomial phase modulated (PPM) signals have been shown to provide improved error rate performance relative to conventional modulation formats under additive white Gaussian noise and fading channels in single-input single-output (SISO) communication systems. In this dissertation, systems with two and four transmit antennas using PPM signals were presented. In both cases we employed full-rate space-time block codes in order to take advantage of the multipath channel. For two transmit antennas, we used the orthogonal space-time block code (OSTBC) proposed by Alamouti and performed symbol-wise decoding by estimating the phase coefficients of the PPM signal using three different methods: maximum-likelihood (ML), sub-optimal ML (S-ML), and the high-order ambiguity function (HAF). In the case of four transmit antennas, we used the full-rate quasi-OSTBC (QOSTBC) proposed by Jafarkhani. However, in order to ensure the best error rate performance, PPM signals were selected so as to maximize the QOSTBC’s minimum coding gain distance (CGD). Since this method does not always provide a unique solution, an additional criterion known as the maximum channel interference coefficient (CIC) was proposed. Through Monte Carlo simulations it was shown that by using QOSTBCs along with PPM constellations properly selected under the CGD and CIC criteria, full diversity in flat fading channels, and thus low BER at high signal-to-noise ratios (SNR), can be ensured. Lastly, the performance of symbol-wise decoding for QOSTBCs was evaluated. In this case a quasi-zero-forcing method was used to decouple the received signal, and it was shown that although this technique reduces the decoding complexity of the system, there is a penalty to be paid in terms of error rate performance at high SNRs.
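The Alamouti OSTBC mentioned above has a compact encoding that is easy to sketch. The snippet below uses generic complex symbols rather than PPM waveforms, so it only illustrates the orthogonality property that enables the symbol-wise decoding the abstract refers to, not the dissertation's PPM-specific estimators.

```python
import numpy as np

# Minimal sketch of the Alamouti orthogonal space-time block code for
# two transmit antennas. Rows are time slots, columns are antennas.

def alamouti_encode(s1, s2):
    """Map two complex symbols onto two antennas over two symbol periods."""
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

# The code is orthogonal: X^H X = (|s1|^2 + |s2|^2) * I, which is what
# allows the receiver to decouple and decode the symbols one at a time.
X = alamouti_encode(1 + 1j, 2 - 1j)
gram = X.conj().T @ X
print(np.allclose(gram, (abs(1 + 1j) ** 2 + abs(2 - 1j) ** 2) * np.eye(2)))  # -> True
```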

Relevance: 10.00%

Abstract:

In the discussion - The Nevada Gaming Debt Collection Experience - by Larry D. Strate, Assistant Professor, College of Business and Economics at the University of Nevada, Las Vegas, Assistant Professor Strate initially outlines the article by saying: “Even though Nevada has had over a century of legalized gaming experience, the evolution of gaming debt collection has been a recent phenomenon. The author traces that history and discusses implications of the current law.” The discussion opens with a comparison between the gaming industries of New Jersey/Atlantic City, and Las Vegas, Nevada. This contrast serves to point out the disparities in debt handling between the two. “There are major differences in the development of legalized gaming for both Nevada and Atlantic City. Nevada has had over a century of legalized gambling; Atlantic City, New Jersey, has completed a decade of its operation,” Strate informs you. “Nevada's gaming industry has been its primary economic base for many years; Atlantic City's entry into gaming served as a possible solution to a social problem. Nevada's processes of legalized gaming, credit play, and the collection of gaming debts were developed over a period of 125 years; Atlantic City's new industry began with gaming, gaming credit, and gaming debt collection simultaneously in 1976 [via the New Jersey Casino Control Act].” The irony here is that Atlantic City, being the younger venue, had or has a better system for handling debt collection than do the historic and traditional Las Vegas properties. Many of these properties were duplicated in New Jersey, so the dichotomy existed whereby New Jersey casinos could recoup debt while their Nevada counterparts could not. “It would seem logical that a "territory" which permitted gambling in the early 1800’s would have allowed the Nevada industry to collect its debts as any other legal enterprise. But it did not,” Strate says.
Of course, this situation could not be allowed to continue and Strate outlines the evolution. New Jersey tactfully benefitted from Nevada’s experience. “The fundamental change in gaming debt collection came through the legislature as the judicial decisions had declared gaming debts uncollectable by either a patron or a casino,” Strate informs you. “Nevada enacted its gaming debt collection act in 1983, six years after New Jersey,” Strate points out. One of the most noteworthy paragraphs in the entire article is this: “The fundamental change in 1983, and probably the most significant change in the history of gaming in Nevada since the enactment of the Open Gaming Law of 1931, was to allow non-restricted gaming licensees* to recover gaming debts evidenced by a credit instrument. The new law incorporated previously litigated terms with a new one, credit instrument.” The term is legally definable and gives Nevada courts an avenue of due process.

Relevance: 10.00%

Abstract:

The increasing emphasis on mass customization, shortened product lifecycles, and synchronized supply chains, coupled with advances in information systems, is driving most firms toward make-to-order (MTO) operations. Increasing global competition, lower profit margins, and higher customer expectations force MTO firms to plan their capacity by managing the effective demand. The goal of this research was to maximize the operational profits of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage. For integrating the two decisions, a Mixed-Integer Linear Program (MILP) was formulated which can aid an operations manager in an MTO environment to select a set of potential customer orders such that all the selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes, the computational time required to determine an optimal solution is prohibitive. The formulation has a block diagonal structure and can be decomposed into one or more sub-problems (i.e., one sub-problem for each customer order) and a master problem by applying Dantzig-Wolfe decomposition principles. To efficiently solve the original MILP, an exact Branch-and-Price algorithm was developed. Various approximation algorithms were developed to further improve the runtime. Experiments conducted show the efficiency of these algorithms compared to a commercial optimization solver. The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, with an objective of maximizing profits subject to a penalty for tardiness. This dissertation solves the order acceptance and capacity planning problem for a job shop environment with multiple resources.
Both regular and overtime resources are considered. The Branch-and-Price algorithms developed in this dissertation are fast and can be incorporated in a decision support system that can be used on a daily basis to help make intelligent decisions in an MTO operation.
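The core trade-off in order acceptance can be sketched in a much simpler setting than the dissertation's job-shop MILP: a single machine, all orders released at time zero, and brute-force enumeration instead of Branch-and-Price. The data and scale below are illustrative only; the point is that accepting the single most profitable order may be worse than accepting a feasible combination of lesser ones.

```python
from itertools import combinations

# Illustrative sketch of order acceptance with deadlines on a single
# machine. An accepted set is feasible iff, processed in Earliest
# Deadline First order, every order finishes by its deadline.

def edf_feasible(orders):
    """orders: iterable of (profit, processing_time, deadline)."""
    t = 0
    for _profit, proc, deadline in sorted(orders, key=lambda o: o[2]):
        t += proc
        if t > deadline:
            return False
    return True

def best_acceptance(orders):
    """Enumerate all subsets; return (max_profit, best_subset)."""
    best = (0, ())
    for r in range(1, len(orders) + 1):
        for subset in combinations(orders, r):
            if edf_feasible(subset):
                profit = sum(o[0] for o in subset)
                if profit > best[0]:
                    best = (profit, subset)
    return best

# Made-up orders as (profit, processing_time, deadline).
orders = [(10, 4, 4), (12, 3, 5), (7, 2, 6)]
print(best_acceptance(orders)[0])  # -> 19: accept the last two, reject the first
```

Accepting all three orders is infeasible (the second order would finish at time 7, past its deadline of 5), so the optimal policy rejects the first order even though it is individually profitable. Branch-and-Price replaces this exponential enumeration with column generation over the per-order sub-problems.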

Relevance: 10.00%

Abstract:

We present our approach to real-time service-oriented scheduling problems with the objective of maximizing the total system utility. Unlike traditional utility accrual scheduling problems, in which each task is associated with only a single time utility function (TUF), we associate two different TUFs (a profit TUF and a penalty TUF) with each task, to model real-time services that must not only reward early completions but also penalize abortions or deadline misses. The scheduling heuristics proposed in this paper judiciously accept, schedule, and abort real-time services when necessary to maximize the accrued utility. Our extensive experimental results show that the proposed algorithms can significantly outperform traditional scheduling algorithms such as Earliest Deadline First (EDF), traditional utility accrual (UA) scheduling algorithms, and an earlier scheduling approach based on a similar model.
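The two-TUF accounting described above can be sketched as follows. The TUF shapes (a linearly decaying profit, a flat abort penalty) and all numbers are assumptions for illustration; the paper's heuristics decide *which* tasks to accept or abort, while this snippet only shows how accrued utility would be scored for a given outcome.

```python
# Hypothetical sketch of the two-TUF model: a task earns its profit TUF
# value at its completion time, and incurs its penalty TUF value if it
# is aborted or misses its deadline. Shapes and numbers are illustrative.

def profit_tuf(t, deadline, peak):
    """Linearly decaying reward for earlier completion, zero past the deadline."""
    return peak * max(0.0, 1.0 - t / deadline) if deadline > 0 else 0.0

def accrued_utility(tasks, outcome):
    """tasks: id -> (deadline, peak_profit, abort_penalty).
    outcome: id -> completion time, or None if the task was aborted."""
    total = 0.0
    for tid, (deadline, peak, penalty) in tasks.items():
        t = outcome.get(tid)
        if t is None or t > deadline:
            total -= penalty                    # abort / deadline miss
        else:
            total += profit_tuf(t, deadline, peak)
    return total

tasks = {"a": (10.0, 8.0, 3.0), "b": (5.0, 6.0, 2.0)}
# Finish "a" halfway to its deadline (half the peak reward), abort "b".
print(accrued_utility(tasks, {"a": 5.0, "b": None}))  # -> 2.0
```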


Relevance: 10.00%

Abstract:

Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially those associated with the skyrocketing power consumption of large data centers. In the meantime, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve profits for service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiency with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements.
By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements while minimizing the number of powered-on servers, and thus the power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
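The tension between electricity price and delay-sensitive QoS in the multi-electricity-market setting above can be sketched with a toy dispatcher. This is not the dissertation's queue-based model; it assumes each data center behaves as an M/M/1 queue (mean response time 1/(mu - lambda)) and uses made-up prices and rates, purely to show why the cheapest market is not always the right choice.

```python
# Illustrative sketch: dispatch a request to the cheapest data center
# whose queue can still meet the delay bound. All numbers are made up.

def mean_response_time(arrival_rate, service_rate):
    """M/M/1 mean response time; infinite if the queue is unstable."""
    if arrival_rate >= service_rate:
        return float("inf")
    return 1.0 / (service_rate - arrival_rate)

def dispatch(centers, delay_bound):
    """centers: list of (name, electricity_price, arrival_rate, service_rate).
    Return the cheapest center that still meets the delay bound after
    taking one more unit of load, or None if none can."""
    feasible = [
        c for c in centers
        if mean_response_time(c[2] + 1.0, c[3]) <= delay_bound
    ]
    return min(feasible, key=lambda c: c[1])[0] if feasible else None

centers = [
    ("east", 0.04, 8.0, 10.0),   # cheap electricity, heavily loaded queue
    ("west", 0.07, 2.0, 10.0),   # pricier electricity, lightly loaded queue
]
print(dispatch(centers, delay_bound=0.5))  # -> west: "east" would violate QoS
```

With a tight delay bound the dispatcher pays the higher electricity price at "west"; relax the bound and it would switch back to the cheaper "east", which is exactly the profit/QoS trade-off the dissertation optimizes.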