564 results for PENALTY KICKING


Relevance: 10.00%

Abstract:

A feature of many penal codes is that punishments are more severe for repeat offenders, yet economic models have had a hard time providing a theoretical justification for this practice. This paper offers an explanation based on the wage penalty suffered by individuals convicted of a crime. While this penalty probably deters some first-time offenders, it actually hampers deterrence of repeat offenders because of their diminished employment opportunities. We show that in this setting an escalating penalty scheme is optimal and time consistent.

Relevance: 10.00%

Abstract:

Since its introduction into the United States in the 1980s, crack cocaine has been a harsh epidemic that has taken its toll on countless people. This highly addictive, cheap, and readily available drug of abuse has permeated many demographic sectors, mostly low-income, less educated, and urban communities. The epidemic of crack cocaine use in inner-city areas across the United States has been described as an expression of economic marginality and "social suffering" coupled with the local and international forces of drug market economies (Agar 2003). As a derivative of cocaine, crack retains the psychoactive component of the drug but delivers it in a much stronger, quicker, and more addictive fashion. This, coupled with its ready availability and low price, has not only allowed users to become heavily addicted very quickly, but has also subjected them to the stringent and sometimes unequal or inconsistent punishments for possession and distribution of crack cocaine. There are many public health and social ramifications of crack cocaine abuse, and these epidemics appear to target low-income and minority groups. Public health issues relating to physical, mental, and economic strain are addressed, as well as the direct and indirect effects of the punishments that result from the disparity in penalties for cocaine and crack cocaine possession and distribution. Three new policies have recently been introduced into the United States Congress that actively address the disparity in sentencing for drug and criminal activities: (1) the Powder-Crack Cocaine Penalty Equalization Act of 2009 (HR 18, 111th Cong. 2009), (2) the Drug Sentencing Reform and Cocaine Kingpin Trafficking Act of 2009 (HR 265, 111th Cong. 2009), and (3) the Justice Integrity Act of 2009 (111th Cong. 2009).
Although these bills have only been introduced, if passed they have the potential not only to eliminate the crack-cocaine disparity but also to enact laws that help those affected by this epidemic. The final and overarching goal of this paper is to analyze and ultimately choose the ideal policy: one that would eliminate the cocaine and crack disparity regardless of current or future state statutes, while providing the best method of rehabilitation, prevention, and justice.

Relevance: 10.00%

Abstract:

Sexual crimes between the second half of the nineteenth century and the beginning of the twentieth present particular problems regarding the possibility of their prosecution and punishment, tied to the particular way in which these crimes were conceived. Drawing on the analysis of judicial case files, I propose to problematize some of these questions, starting from their conception as "dependent on private initiative" and therefore requiring a "private accusation," which led to long debates in the courts, since this situation admitted more than one reading: were the law's requirements satisfied by the complaint alone, or did the private party have to remain involved up to the sentence itself, that is, through the formal accusation hearing? In this context, the participation of the Agente Fiscal in the proceedings as prosecutor was called into question, and with it the State's own interest in prosecuting these crimes. At the same time, I seek to show the debates that arose in penal practice, which reflect different ways of interpreting and thinking about these crimes, as well as the reasons implicit in their prosecution and punishment or, conversely, in their dismissal and lack of penalty.

Relevance: 10.00%

Abstract:

The research presented in this article aims to identify and analyze cases of environmental conflict in the microrregião de Viçosa, Minas Gerais, Brasil. To this end, data surveys were conducted in the archives of the Ministério Público for the municipalities of the microregion under study, together with a workshop with various social movements of the mesorregião da Zona da Mata. From this survey we verify the conflict established between environmental legislation and farm workers, as well as the concentration of penalties on minor violations of environmental legislation. We conclude that thinking about environmental conflicts requires recognizing inequalities of power and the different types of knowledge and rationality involved in society's appropriation of natural resources.


Relevance: 10.00%

Abstract:

The climatic conditions of mountain habitats are greatly influenced by topography. Large differences in microclimate occur with small changes in elevation, and this complex interaction is an important determinant of mountain plant distributions. In spite of this, elevation is not often considered as a relevant predictor in species distribution models (SDMs) for mountain plants. Here, we evaluated the importance of including elevation as a predictor in SDMs for mountain plant species. We generated two sets of SDMs for each of 73 plant species that occur in the Pacific Northwest of North America; one set of models included elevation as a predictor variable and the other set did not. AUC scores indicated that omitting elevation as a predictor resulted in a negligible reduction of model performance. However, further analysis revealed that the omission of elevation resulted in large over-predictions of species' niche breadths; this effect was most pronounced for species that occupy the highest elevations. In addition, the inclusion of elevation as a predictor constrained the effects of other predictors that superficially affected the outcome of the models generated without elevation. Our results demonstrate that the inclusion of elevation as a predictor variable improves the quality of SDMs for high-elevation plant species. Because of the negligible AUC score penalty for over-predicting niche breadth, our results support the notion that AUC scores alone should not be used as a measure of model quality. More generally, our results illustrate the importance of selecting biologically relevant predictor variables when constructing SDMs.
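The AUC behavior described above can be reproduced with the rank-based (Mann-Whitney) estimator. A minimal sketch with hypothetical presence/absence scores, illustrating how two models with very different predicted niche breadths can earn the same AUC:

```python
def auc(pos_scores, neg_scores):
    """Rank-based AUC: probability that a random presence site
    scores higher than a random absence site (ties count 0.5)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Two hypothetical models scored on the same sites: one predicts a
# narrow (high-elevation) niche, the other assigns high suitability
# almost everywhere, yet both rank presences above absences perfectly.
narrow = auc([0.9, 0.8, 0.7], [0.2, 0.3, 0.1])  # -> 1.0
broad = auc([0.9, 0.8, 0.7], [0.6, 0.5, 0.4])   # -> 1.0
```

Both hypothetical models score AUC = 1.0 even though the second one over-predicts suitability across the absence sites, which is the sense in which AUC alone under-penalizes over-predicted niche breadths.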


Relevance: 10.00%

Abstract:

The naïve Bayes approach is a simple but often satisfactory method for supervised classification. In this paper, we focus on the naïve Bayes model and propose the application of regularization techniques to learn a naïve Bayes classifier. The main contribution of the paper is a stagewise version of the selective naïve Bayes, which can be considered a regularized version of the naïve Bayes model. We call it forward stagewise naïve Bayes. For comparison’s sake, we also introduce an explicitly regularized formulation of the naïve Bayes model, where conditional independence (absence of arcs) is promoted via an L1/L2 group penalty on the parameters that define the conditional probability distributions. Although already published in the literature, this idea has only been applied for continuous predictors. We extend this formulation to discrete predictors and propose a modification that yields an adaptive penalization. We show that, whereas the L1/L2 group penalty formulation only discards irrelevant predictors, the forward stagewise naïve Bayes can discard both irrelevant and redundant predictors, which are known to be harmful for the naïve Bayes classifier. Both approaches, however, usually improve the classical naïve Bayes model’s accuracy.
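The greedy selection idea behind a forward stagewise selective naïve Bayes can be sketched as follows. This is not the paper's exact algorithm; the toy data, Laplace smoothing, and training-accuracy stopping rule are assumptions for illustration. Predictors are added one at a time only while they improve accuracy, so a redundant copy of an already-selected predictor is never admitted:

```python
def nb_fit(X, y, feats):
    """Laplace-smoothed discrete naive Bayes over a feature subset."""
    classes = sorted(set(y))
    prior = {c: y.count(c) / len(y) for c in classes}
    cond = {}  # (feature, value, class) -> P(value | class)
    for f in feats:
        vals = sorted({row[f] for row in X})
        for c in classes:
            rows = [row for row, yc in zip(X, y) if yc == c]
            for v in vals:
                hits = sum(1 for row in rows if row[f] == v)
                cond[(f, v, c)] = (hits + 1) / (len(rows) + len(vals))
    return classes, prior, cond

def nb_predict(model, row, feats):
    classes, prior, cond = model
    def score(c):
        p = prior[c]
        for f in feats:
            p *= cond.get((f, row[f], c), 1e-9)
        return p
    return max(classes, key=score)

def accuracy(X, y, feats):
    model = nb_fit(X, y, feats)
    return sum(nb_predict(model, row, feats) == yc
               for row, yc in zip(X, y)) / len(y)

def forward_stagewise(X, y):
    """Greedily add the predictor that most improves accuracy;
    stop when no addition helps, so redundant copies are skipped."""
    selected, best = [], accuracy(X, y, [])
    remaining = sorted(range(len(X[0])))
    while remaining:
        acc, f = max(((accuracy(X, y, selected + [f]), f) for f in remaining),
                     key=lambda t: (t[0], -t[1]))
        if acc <= best:
            break
        selected.append(f); best = acc; remaining.remove(f)
    return selected

# Toy data: predictor 1 duplicates predictor 0; predictor 2 is noise.
y = [0, 0, 0, 0, 1, 1, 1, 1]
X = [[yc, yc, i % 2] for i, yc in enumerate(y)]
selected = forward_stagewise(X, y)  # admits predictor 0, then stops
```

On this toy set the procedure admits the informative predictor 0 and then stops: its redundant copy (predictor 1) adds no accuracy, mirroring the claim that stagewise selection can discard redundant as well as irrelevant predictors.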

Relevance: 10.00%

Abstract:

We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider a specific encoder architecture for encoding latency analysis by assuming an unlimited processing capacity on the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times has to be considered, and the encoding latency is solved systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of low latency encoder design with low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add higher encoding latency to the encoder. Experimental results for JMVM prediction structures illustrate how low latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
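Under the unlimited-capacity assumption, the lower-bound computation can be sketched as a longest-path (critical-path) evaluation on the prediction graph. The frame names, dependency structure, and unit processing times below are hypothetical, not an actual JMVM structure:

```python
# Frames are graph nodes; an edge u -> v means v is predicted from u and
# cannot start encoding until u finishes. With unlimited parallelism the
# encoding latency of a frame is the length of its longest dependency path.
def encoding_latency(proc_time, deps):
    """proc_time: frame -> processing time; deps: frame -> referenced frames."""
    memo = {}
    def finish(f):
        if f not in memo:
            memo[f] = proc_time[f] + max((finish(d) for d in deps.get(f, [])),
                                         default=0.0)
        return memo[f]
    return {f: finish(f) for f in proc_time}

# Hypothetical 4-frame structure (I0 <- P1, P1 <- B2, P1 <- B3), unit times:
lat = encoding_latency({'I0': 1, 'P1': 1, 'B2': 1, 'B3': 1},
                       {'P1': ['I0'], 'B2': ['P1'], 'B3': ['P1']})
```

Here frame B2 cannot finish before time 3 regardless of available processors, because it sits at the end of the I0 → P1 → B2 chain; such edges are the "prediction relationships that add higher encoding latency" one would target when redesigning the structure.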

Relevance: 10.00%

Abstract:

In this paper we study, through a concrete case, the feasibility of using a high-level, general-purpose logic language in the design and implementation of applications targeting wearable computers. The case study is a "sound spatializer" which, given real-time signals for monaural audio and heading, generates stereo sound which appears to come from a position in space. The use of advanced compile-time transformations and optimizations made it possible to execute code written in a clear style without efficiency or architectural concerns on the target device, while meeting strict existing time and memory constraints. The final executable compares favorably with a similar implementation written in C. We believe that this case is representative of a wider class of common pervasive computing applications, and that the techniques we show here can be put to good use in a range of scenarios. This points to the possibility of applying high-level languages, with their associated flexibility, conciseness, ability to be automatically parallelized, sophisticated compile-time tools for analysis and verification, etc., to the embedded systems field without paying an unnecessary performance penalty.
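The paper's spatializer is written in a logic language; as an illustration of one ingredient such a system needs, here is a minimal constant-power amplitude-panning sketch in Python. The function names and the -90..+90 azimuth convention are assumptions, and a real spatializer would also model interaural time delay:

```python
import math

def stereo_gains(azimuth_deg):
    """Constant-power pan: map a source azimuth (-90 = hard left,
    +90 = hard right) to left/right gains with L^2 + R^2 = 1."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

def spatialize(mono, azimuth_deg):
    """Turn a mono sample sequence into (left, right) pairs."""
    l, r = stereo_gains(azimuth_deg)
    return [(s * l, s * r) for s in mono]
```

The constant-power constraint keeps perceived loudness roughly stable as the heading changes, which matters when the gains are updated in real time from a heading sensor.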

Relevance: 10.00%

Abstract:

Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular type, it is just one of multiple forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification, and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. Supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing structures of Gaussian Bayesian networks using L1-regularization as a filter.
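The sparsity mechanism of L1-regularization can be made concrete with its proximal operator. A minimal sketch: the coefficients below are illustrative, and the closed-form shortcut only holds for an orthonormal design:

```python
def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink toward zero and
    zero out values with |z| <= t -- the mechanism behind lasso sparsity."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

# Under an orthonormal design the L1-regularized solution is just the
# soft-thresholded least-squares estimate: small coefficients are
# discarded exactly, yielding a sparse model.
ols = [2.5, -0.3, 0.0, 1.1, -0.05]
lasso = [soft_threshold(b, 0.5) for b in ols]  # only two survive
```

This is why an L1 penalty produces exact zeros (a parsimonious input subset), whereas a quadratic penalty only shrinks coefficients without ever removing them.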

Relevance: 10.00%

Abstract:

Safety assessment of historic masonry structures is an open problem. The material is heterogeneous and anisotropic, the previous state of stress is hard to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis is applicable to this kind of structure, and it has been considered a suitable tool since then. In cases where no sliding occurs, the standard limit analysis theorems constitute an excellent tool due to their simplicity and robustness. It is enough to find any equilibrium solution that satisfies the limit constraints of the material, in the certainty that its load will be equal to or less than the actual load at the onset of collapse; it is not necessary to know the actual stress state. Furthermore, this load at the onset of collapse is unique (uniqueness theorem), and it can be obtained as the optimum of either of a pair of dual convex mathematical programs. However, if the mechanisms at the onset of collapse involve sliding, any solution must satisfy both static and kinematic constraints, as well as a special kind of disjunctive constraints linking the previous ones, which can be formulated as complementarity constraints. In the latter case the existence of a single solution is not guaranteed, so it is necessary to look for other ways to treat the uncertainty associated with its multiplicity. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This method is easy to set up from a mathematical point of view, but computationally intractable, due to the complementarity constraints y·z = 0, y ≥ 0, z ≥ 0, which are neither convex nor smooth. The resulting decision problem is NP-complete, and the corresponding global optimization problem is NP-hard.
Nevertheless, obtaining a solution (success is not guaranteed) is an affordable problem. This thesis proposes to solve that problem through Successive Linear Programming, taking advantage of the special characteristics of the complementarity constraints, which written in bilinear form are y·z = 0, y ≥ 0, z ≥ 0, and of the fact that the complementarity error (in bilinear form) is an exact penalty function. But when it comes to finding the worst solution, the equivalent global optimization problem is intractable (NP-hard). Furthermore, until a maximum or minimum principle is demonstrated, it is questionable whether the effort spent in approximating this minimum is justified.
Chapter 5 proposes finding the frequency distribution of the load factor over all possible onset-of-collapse solutions for a simple example. For this purpose, a Monte Carlo sampling of solutions is performed, using an exact polytope computation method as a reference. The ultimate goal is to determine to what extent the search for the global minimum is justified, and to propose an alternative approach to safety assessment based on probabilities. The frequency distributions of the load factors obtained for the case study show that both the maximum and the minimum load factors are very infrequent, and all the more so the more perfect and continuous the contact is. The results confirm the interest of developing new probabilistic methods. Chapter 6 proposes such a method, based on multiple solutions obtained from random starting points and on qualifying the results through order statistics. The purpose is to determine, for each solution, the probability of onset of collapse. The method is applied (following the expectation reduction proposed by Ordinal Optimization) to obtain a solution that lies within a given percentage of the worst ones. Finally, Chapter 7 proposes hybrid methods incorporating metaheuristics for cases in which the search for the global minimum is justified.
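The expectation-reduction argument from Ordinal Optimization can be sketched with a one-line order-statistics bound, assuming the random restarts sample solutions independently and uniformly (itself an idealization):

```python
import math

def hit_probability(alpha, n):
    """Chance that at least one of n independent random restarts lands
    in the worst alpha-fraction of onset-of-collapse solutions."""
    return 1.0 - (1.0 - alpha) ** n

def samples_needed(alpha, confidence):
    """Smallest n with hit_probability(alpha, n) >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - alpha))

# Illustrative target: reach the worst 5% of load factors with 95% confidence.
n = samples_needed(0.05, 0.95)  # -> 59 restarts
```

With these illustrative numbers, 59 restarts suffice: a modest number of random starting points already locates, with high confidence, a solution among the worst few percent, without ever approximating the intractable global minimum.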

Relevance: 10.00%

Abstract:

This thesis focuses on the study of several numerical procedures used to solve the dynamics of a multibody system subjected to constraints and impact. The system may be composed of rigid and deformable bodies connected by different types of joints. Within this framework, special attention is paid to energy-consistent methods, which preserve the theoretical behavior of the energy at each time step. In other words, a consistent method keeps the total energy constant in a conservative problem, and provides a positive decrease in the total energy when dissipative forces are present. A numerical algorithm that is energetically consistent has been developed for solving the dynamical equations of multibody systems. Energetic consistency in contacts and constraints is formulated using Lagrange multipliers, penalty, and augmented Lagrangian methods. A contact methodology is proposed for rigid bodies whose boundary is represented by implicit surfaces. The method is based on a suitably regularized constraint formulation, adapted both to fulfill the contact constraint exactly and to be consistent with the conservation of the total energy. In this context two different approaches are studied: the first is applied to purely elastic contact (without deformation) and formulated with penalty and augmented Lagrangian methods; the second is based on a constitutive model for contact with penetration. In this second approach, a penalty potential is used in the constitutive model that restores the energy stored in the contact when no dissipative effects are present, and dissipates energy consistently with the continuous model when friction and damping are considered.
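The penalty-potential idea described above can be sketched in scalar form for normal contact only. The stiffness and damping values below are placeholders, and the thesis's actual formulation operates on regularized implicit-surface constraints:

```python
def contact_force(g, k, c=0.0, g_dot=0.0):
    """Penalty contact: normal force for penetration g >= 0 with
    stiffness k and optional damping c; zero when not in contact."""
    if g <= 0.0:
        return 0.0
    return k * g + c * g_dot

def stored_energy(g, k):
    """Potential stored by the penalty spring. With c = 0 it is fully
    recovered on release, consistent with total-energy conservation."""
    return 0.5 * k * max(g, 0.0) ** 2
```

Because the elastic part derives from the quadratic potential, the energy stored while penetrating is exactly returned on separation when no damping or friction acts; the dissipative term c·ġ then provides the positive energy decrease expected of a consistent method.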