914 results for Errors and blunders, Literary.


Relevance:

100.00%

Publisher:

Abstract:

This research project arose in 2001, in response to the need to confront a new legal reality: mobbing. The study of what had been published (mostly in fields outside the law) was decisive, but above all so were the interviews with mobbing victims and their associations; this, together with the absence of any international treatment of the subject, forced a self-taught path towards defining mobbing in legal terms. The thesis defines mobbing as workplace pressure aimed at a worker's self-elimination through denigration ("presión laboral tendenciosa", tendentious workplace pressure), and with this, for the first time, there is a definition of mobbing in a line and a half, with full legal validity, which can be memorised and therefore disseminated in order to correct the problem. The so-called "uniform concept of mobbing" stresses denigration as the mechanism, as distinct from degrading treatment, and self-elimination as the aim of intentional conduct. The work offers formulas for distinguishing mobbing cases from related figures, and in this respect the "rule of 9" for determining whether mobbing exists should be mentioned; as regards statistics, many of those presented so far are criticised on methodological grounds and some court-based statistics are contributed; above all, it warns of the legal risks of a foreseeable specific anti-mobbing regulation, through an examination of the various definitions put forward to date. The second part of the thesis examines how sensitised our legal system and courts are, drawing on more than one hundred and fifty judgments handed down on the matter, including, of course, all of those collected in the publishers' databases. The analysis serves to show the soundness of the approach defended here, exposing errors and contradictions. The thesis argues that tendentious workplace pressure, beyond violating the constitutional right to work and the fundamental rights to moral integrity and honour, is a transgression of an entire "constitutional spirit", and in this light it analyses in detail both the possibility of an amparo appeal and the right to indemnity for those who confront this situation. After noting the advantages of reacting through the procedural channel for the protection of fundamental rights, it analyses the much-used action of art.50 ET, making suggestive contributions such as the limitation period or the "doctrine of antecedents", and answering the questions of whether the worker must keep working and of provisional enforcement. As regards Social Security actions, the thesis distinguishes between temporary and permanent incapacity (depression) and death and survivors' benefits, contributing for the former the technique called "three-level interpretation" and ruling out, by legal imperative, treating suicide after mobbing as a work-related accident, while offering a fairly reasonable substitute in the form of the non-occupational accident. Alongside this, it argues for the viability of the surcharge under art.123 LGSS. In civil law, the thesis takes a "lege ferenda" position in favour of redirecting this type of action for compensation of psychological and moral damage to the civil jurisdiction, for the sake of a fuller explanation of the origin of the quantum, but above all it considers the STS of 11-3-04 inadmissible, for a number of reasons, chiefly because it comes to authorise this type of conduct "de facto".
The possibility of administrative action against this psychosocial risk is analysed in two arenas, the company and the Public Administration. While the route concerning the former has some twists that are unravelled, the situation is radically frustrating in the Administration (where the greatest breeding ground for mobbing is found), owing to RD 707/2002, and to an even greater extent to Criterio Técnico 34/2003, by which the interpretation of the Director General of the Labour and Social Security Inspectorate has tacitly and partially repealed the Ley de Prevención de Riesgos Laborales for the Administration. In criminal law, the thesis opts "a priori" for two offences, the offences against workers' rights and the offence of degrading treatment; in practice, however, only the second path can reach a safe harbour. Finally, a detailed study is made of Ley 62/2003, a law that was publicised as regulating moral harassment and that has since been defended as an advance against mobbing. The thesis warns that neither claim is true, and that the law has created a "legal mirage" that may harm mobbing victims, besides having a structure that is unsuitable for a future explicit anti-mobbing regulation.

Relevance:

100.00%

Publisher:

Abstract:

This essay argues that current Latin American literature is abandoning its own sphere: that of well-defined fiction, of the clear requirements a text must meet to constitute "literature", even of classification into genres. That sphere of "the literary", sheltered by rules and institutions, is losing consistency, and with it the power and capacity for political pressure that literature held until a few decades ago. Now that the era of its autonomy is over, current literature speaks of a modern life in which "everything cultural is economic" and "all fiction is reality" (and vice versa, in both cases). The everyday is now life itself, but life intervened by information and communication technologies (which lend it traits of virtuality or unreality). These works pose the challenge of attempting a reading from other parameters; otherwise one may fall into the simplistic move of labelling them non-literature or "bad literature".

Relevance:

100.00%

Publisher:

Abstract:

This essay examines the debates and re-readings of the so-called Barroco de Indias (Baroque of the Indies) that have arisen in Latin American studies in recent decades. It focuses on the scholarship on the Peruvian writer Juan Espinosa Medrano (1629?-1688), known as "el Lunarejo", and problematises the claims that see in the Baroque and its literary representatives the first processes of definition of a distinctly "American" identity and modernity. The author considers that these readings, although they seek to respond to colonialist interpretations, rearticulate a Latin Americanist project that excludes the conflictive ethno-cultural relations between indigenous and non-indigenous peoples, and that they also reinforce what Aníbal Quijano and Walter Mignolo call the coloniality of power.

Relevance:

100.00%

Publisher:

Abstract:

On 4 October 2012, Professor Cecilia Mafla Bustamante interviewed the well-known Cuenca poet Efraín Jara Idrovo. In this dialogue the writer recounts his poetic trajectory, his political ideology and his national and international literary influences. Together with his son Johnny Jara, he also reflects on the poem Sollozo por Pedro Jara, considered his finest work. He further examines the semiotic structure of the linguistic sign and its two-sided character, looking both towards meaning and towards the materiality of the sign, following the theory of Jan Mukařovský. He elaborates his existential thought in the notion that "the world is the configuration of consciousness", and finally reflects on the writing process and on poetic production in Ecuador.

Relevance:

100.00%

Publisher:

Abstract:

We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry.
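The calibration step the abstract builds on is standard photogrammetric camera calibration from 3D-2D correspondences. A minimal sketch of that underlying routine using OpenCV's cv2.calibrateCamera is shown below; the planar target, camera poses and image size are synthetic illustrative assumptions, not details taken from the paper.

```python
# Sketch of the underlying photogrammetric calibration step: recover camera
# intrinsics and per-view extrinsics from 3D-2D correspondences with OpenCV.
# The planar target, poses and image size below are synthetic/illustrative.
import numpy as np
import cv2

# Planar 9x6 target with 25 mm spacing (checkerboard-style).
obj = np.zeros((9 * 6, 3), np.float32)
obj[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2) * 25.0

# "Ground-truth" camera used only to synthesise detections for this demo.
K_true = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
poses = [  # (rvec, tvec) for three views of the target
    (np.array([0.05, -0.02, 0.01]), np.array([-100.0, -60.0, 600.0])),
    (np.array([-0.30, 0.20, 0.00]), np.array([-80.0, -40.0, 700.0])),
    (np.array([0.20, 0.35, -0.10]), np.array([-120.0, -50.0, 650.0])),
]
objpoints, imgpoints = [], []
for rvec, tvec in poses:
    pts, _ = cv2.projectPoints(obj, rvec, tvec, K_true, np.zeros(5))
    objpoints.append(obj)
    imgpoints.append(pts.astype(np.float32))

# calibrateCamera recovers the intrinsic matrix (focal length, optic centre)
# and one rotation/translation pair (extrinsics) per view.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, (640, 480), None, None)
print("re-projection RMS:", rms)      # near zero for noiseless synthetic data
print("recovered intrinsics:\n", K)
```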

Relevance:

100.00%

Publisher:

Abstract:

These notes have been issued on a small scale in 1983 and 1987 and on request at other times. This issue follows two items of news. First, Walter Colquitt and Luther Welsh found the 'missed' Mersenne prime M110503 and advanced the frontier of complete Mp-testing to 139,267. In so doing, they terminated Slowinski's significant string of four consecutive Mersenne primes. Secondly, a team of five established a non-Mersenne number as the largest known prime. This result terminated the 1952-89 reign of Mersenne primes. All the original Mersenne numbers with p < 258 were factorised some time ago. The Sandia Laboratories team of Davis, Holdridge & Simmons, with some little assistance from a CRAY machine, cracked M211 in 1983 and M251 in 1984. They contributed their results to the 'Cunningham Project', care of Sam Wagstaff. That project is now moving apace thanks to developments in technology, factorisation and primality testing. New levels of computer power and new computer architectures motivated by the open-ended promise of parallelism are now available. Once again, the suppliers may be offering free buildings with the computer. However, the Sandia '84 CRAY-1 implementation of the quadratic-sieve method is now outpowered by the number-field sieve technique. This is deployed on either purpose-built hardware or large syndicates, even distributed world-wide, of collaborating standard processors. New factorisation techniques of both special and general applicability have been defined and deployed. The elliptic-curve method finds large factors with helpful properties, while the number-field sieve approach is breaking down composites with over one hundred digits. The material is updated on an occasional basis to follow the latest developments in primality-testing large Mp and factorising smaller Mp; all dates derive from the published literature or referenced private communications. Minor corrections, additions and changes merely advance the issue number after the decimal point. The reader is invited to report any errors and omissions that have escaped the proof-reading, to answer the unresolved questions noted and to suggest additional material associated with this subject.
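Complete Mp-testing of the kind mentioned above is done with the Lucas-Lehmer test. A minimal sketch follows; it is illustrative only, since serious Mersenne searches use FFT-based big-integer arithmetic rather than plain Python integers.

```python
# Minimal Lucas-Lehmer primality test for Mersenne numbers M_p = 2**p - 1.
# Illustrative only; serious Mp-testing uses FFT-based big-integer arithmetic.
def is_mersenne_prime(p: int) -> bool:
    if p == 2:
        return True                 # M_2 = 3 is prime; the test below needs odd p
    m = (1 << p) - 1                # M_p
    s = 4
    for _ in range(p - 2):          # iterate s <- s^2 - 2 (mod M_p)
        s = (s * s - 2) % m
    return s == 0

# Exponents of the small Mersenne primes: 2, 3, 5, 7, 13, 17, 19, 31, ...
print([p for p in range(2, 32)
       if all(p % d for d in range(2, p)) and is_mersenne_prime(p)])
```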

Relevance:

100.00%

Publisher:

Abstract:

Aims: To investigate the effects of electronic prescribing (EP) on prescribing quality, as indicated by prescribing errors and pharmacists' clinical interventions, in a UK hospital. Methods: Prescribing errors and pharmacists' interventions were recorded by the ward pharmacist during a 4 week period both pre- and post-EP, with a second check by the principal investigator. The percentage of new medication orders with a prescribing error and/or pharmacist's intervention was calculated for each study period. Results: Following the introduction of EP, there was a significant reduction in both pharmacists' interventions and prescribing errors. Interventions reduced from 73 (3.0% of all medication orders) to 45 (1.9%) (95% confidence interval (CI) for the absolute reduction 0.2, 2.0%), and errors from 94 (3.8%) to 48 (2.0%) (95% CI 0.9, 2.7%). Ten EP-specific prescribing errors were identified. Only 52% of pharmacists' interventions related to a prescribing error pre-EP, and 60% post-EP; only 40% and 56% of prescribing errors resulted in an intervention pre- and post-EP, respectively. Conclusions: EP improved the quality of prescribing by reducing both prescribing errors and pharmacists' clinical interventions. Prescribers and pharmacists need to be aware of new types of error with EP, so that they can best target their activities to reduce clinical risk. Pharmacists may need to change the way they work to complement, rather than duplicate, the benefits of EP.
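The quoted interval for the reduction in errors can be reproduced approximately with the normal approximation for a difference of two proportions. In the sketch below the denominators are back-calculated from the reported percentages and are therefore assumptions, not figures quoted in the abstract.

```python
# Sketch: 95% CI for the absolute reduction in prescribing-error rate using the
# normal approximation for a difference of two proportions. The denominators
# are back-calculated approximations from the reported percentages
# (94 / 3.8% ~= 2474 orders pre-EP, 48 / 2.0% = 2400 post-EP), not figures
# quoted in the abstract.
from math import sqrt

err_pre, n_pre = 94, 2474
err_post, n_post = 48, 2400

p1, p2 = err_pre / n_pre, err_post / n_post
diff = p1 - p2                                   # absolute reduction
se = sqrt(p1 * (1 - p1) / n_pre + p2 * (1 - p2) / n_post)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"pre-EP error rate:  {p1:.1%}")
print(f"post-EP error rate: {p2:.1%}")
print(f"absolute reduction: {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")
# -> roughly the 0.9% to 2.7% interval reported in the abstract
```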

Relevance:

100.00%

Publisher:

Abstract:

The principles of operation of an experimental prototype instrument known as J-SCAN are described, along with the derivation of formulae for the rapid calculation of normalized impedances; the structure of the instrument; relevant probe design parameters; digital quantization errors; and approaches to the optimization of single frequency operation. An eddy current probe is used as the inductance element of a passive tuned circuit which is repeatedly excited with short impulses. Each impulse excites an oscillation which is subject to decay dependent upon the values of the tuned-circuit components: resistance, inductance and capacitance. Changing conditions under the probe that affect the resistance and inductance of this circuit will thus be detected through changes in the transient response. These changes in transient response, oscillation frequency and rate of decay, are digitized, and normalized values for probe resistance and inductance changes are then calculated immediately in a microprocessor. This approach, coupled with a minimum of analogue processing and a maximum of digital processing, has advantages compared with conventional approaches to eddy current instruments. In particular: the absence of an out-of-balance condition, and the flexibility and stability of digital data processing.
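The abstract does not give the instrument's formulae, but the idea of recovering resistance and inductance from the transient can be illustrated with an idealized series RLC model, in which the measured decay rate and damped frequency determine R and L once the capacitance is known. The sketch below is written under that assumption and is not the J-SCAN derivation.

```python
# Illustrative sketch (not the J-SCAN formulae): for an idealized series RLC
# circuit the impulse response decays as exp(-alpha*t) * cos(omega_d*t), with
#   alpha   = R / (2L)                 (decay rate)
#   omega_d = sqrt(1/(LC) - alpha^2)   (damped oscillation frequency)
# so a measured (alpha, omega_d) pair plus a known capacitance C gives R and L.
from math import pi

def r_and_l_from_transient(alpha: float, f_d: float, C: float):
    """Return (R, L) from decay rate alpha [1/s], damped frequency f_d [Hz], C [F]."""
    omega_d = 2 * pi * f_d
    omega0_sq = omega_d**2 + alpha**2   # 1/(LC)
    L = 1.0 / (C * omega0_sq)
    R = 2.0 * alpha * L
    return R, L

# Example: C = 10 nF, measured 500 kHz ringing decaying at alpha = 5e4 1/s.
R, L = r_and_l_from_transient(alpha=5e4, f_d=500e3, C=10e-9)
print(f"R = {R:.3f} ohm, L = {L * 1e6:.3f} uH")
```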

Relevance:

100.00%

Publisher:

Abstract:

The paper provides one of the first applications of the double bootstrap procedure (Simar and Wilson 2007) in a two-stage estimation of the effect of environmental variables on non-parametric estimates of technical efficiency. This procedure enables consistent inference within models explaining efficiency scores, while simultaneously producing standard errors and confidence intervals for these efficiency scores. The application is to 88 livestock and 256 crop farms in the Czech Republic, split into individual and corporate farms.
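As a very rough illustration of the second-stage inference problem only (not the Simar-Wilson 2007 algorithm itself, which uses truncated maximum-likelihood regression and also bootstraps the DEA scores), the sketch below fits a linear model of synthetic efficiency scores on an environmental variable and uses a parametric bootstrap for the slope's confidence interval. All data and the OLS stand-in are assumptions for illustration.

```python
# Highly simplified sketch of the second-stage idea only (NOT the full
# Simar-Wilson double bootstrap): fit a linear model of efficiency scores on an
# environmental variable, then use a parametric bootstrap for the slope's CI.
import numpy as np

rng = np.random.default_rng(0)
n = 200
z = rng.uniform(0, 1, n)                                            # environmental variable
delta = np.clip(1.2 + 0.5 * z + rng.normal(0, 0.2, n), 1.0, None)   # synthetic scores >= 1

X = np.column_stack([np.ones(n), z])
beta_hat, *_ = np.linalg.lstsq(X, delta, rcond=None)
sigma_hat = (delta - X @ beta_hat).std(ddof=2)

B = 2000
slopes = np.empty(B)
for b in range(B):
    delta_star = np.clip(X @ beta_hat + rng.normal(0, sigma_hat, n), 1.0, None)
    slopes[b] = np.linalg.lstsq(X, delta_star, rcond=None)[0][1]

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope = {beta_hat[1]:.3f}, bootstrap 95% CI [{lo:.3f}, {hi:.3f}]")
```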

Relevance:

100.00%

Publisher:

Abstract:

This research is associated with the goal of the horticultural sector of the Colombian southwest, which is to obtain climatic information, specifically, to predict the monthly average temperature in sites where it has not been measured. The data correspond to monthly average temperature, and were recorded in meteorological stations at Valle del Cauca, Colombia, South America. Two components are identified in the data of this research: (1) a component due to temporal aspects, determined by characteristics of the time series, the distribution of the monthly average temperature through the months, and the temporal phenomena which increased (El Niño) and decreased (La Niña) the temperature values, and (2) a component due to the sites, determined by the clear differentiation of two populations, the valley and the mountains, which are associated with the pattern of monthly average temperature and with the altitude. Finally, due to the closeness between meteorological stations it is possible to find spatial correlation between data from nearby sites. In the first instance a random coefficient model without spatial covariance structure in the errors is obtained by month and geographical location (mountains and valley, respectively). Models for wet periods in mountains show a normal distribution in the errors; models for the valley and dry periods in mountains do not exhibit a normal pattern in the errors. In models of mountains and wet periods, omni-directional weighted variograms for residuals show spatial continuity. The random coefficient model without spatial covariance structure in the errors and the random coefficient model with spatial covariance structure in the errors both capture the influence of the El Niño and La Niña phenomena, which indicates that the inclusion of the random part in the model is appropriate. The altitude variable contributes significantly in the models for mountains. In general, the cross-validation process indicates that the random coefficient models with spherical and Gaussian spatial covariance structures are the best models for the wet periods in mountains, and the worst model is the one used by the Colombian Institute for Meteorology, Hydrology and Environmental Studies (IDEAM) to predict temperature.
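The spatial-continuity check mentioned above rests on empirical variograms of the model residuals. A minimal sketch of an omnidirectional empirical semivariogram is given below; the station coordinates and residuals are synthetic placeholders, not the Valle del Cauca data.

```python
# Sketch: omnidirectional empirical semivariogram of model residuals at
# station locations, the kind of diagnostic used to check spatial continuity.
# Coordinates and residuals below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, (60, 2))          # station locations (km)
resid = rng.normal(0, 1.0, 60)                 # model residuals at stations

# All pairwise distances and semivariances 0.5 * (r_i - r_j)^2.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
g = 0.5 * (resid[:, None] - resid[None, :]) ** 2
iu = np.triu_indices(len(resid), k=1)          # use each pair once
d, g = d[iu], g[iu]

# Average semivariance in distance bins.
bins = np.linspace(0, 50, 11)
idx = np.digitize(d, bins)
for k in range(1, len(bins)):
    if np.any(idx == k):
        print(f"{bins[k-1]:5.1f}-{bins[k]:5.1f} km: "
              f"gamma = {g[idx == k].mean():.3f}  (n = {(idx == k).sum()})")
```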

Relevance:

100.00%

Publisher:

Abstract:

Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time consuming and depend on human judgements, making them error prone. The methods are also limited to optical see-through HMDs. Building on our existing HMD calibration method [1], we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in various positions. The locations of image features on the calibration object are then re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner in both see-through and non-see-through modes and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors and involves no error-prone human measurements.
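The reported error metric is the reprojection error. A minimal sketch of how such an error is computed from a calibrated model is shown below; the variable names and toy data are illustrative, not the paper's setup.

```python
# Sketch: reprojection error of a calibrated model, i.e. the mean distance in
# pixels between detected image features and the projections of the matching
# 3D points through the recovered parameters. Toy data, illustrative names.
import numpy as np
import cv2

def mean_reprojection_error(obj_pts, img_pts, K, dist, rvec, tvec) -> float:
    """Mean Euclidean distance (pixels) between detected and reprojected points."""
    proj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
    diffs = img_pts.reshape(-1, 2) - proj.reshape(-1, 2)
    return float(np.linalg.norm(diffs, axis=1).mean())

# Toy check: points projected through the same model reproject exactly (error 0).
obj = np.random.rand(20, 3) * [100, 100, 0] + [0, 0, 500]     # points on a plane
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
rvec, tvec, dist = np.zeros(3), np.zeros(3), np.zeros(5)
img, _ = cv2.projectPoints(obj, rvec, tvec, K, dist)
print(mean_reprojection_error(obj, img, K, dist, rvec, tvec))  # -> 0.0
```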

Relevance:

100.00%

Publisher:

Abstract:

The availability of a network strongly depends on the frequency of service outages and the recovery time for each outage. The loss of network resources includes complete or partial failure of hardware and software components, power outages, scheduled maintenance of software and hardware, operational errors such as configuration errors, and acts of nature such as floods, tornadoes and earthquakes. This paper proposes a practical approach to enhancing QoS routing by providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and an LSP request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examines several methods, including single, double and multi-repair routes and the prioritization of signals along the protected paths, to improve Quality of Service (QoS) and throughput and to reduce the cost of protection path placement, delay, congestion and collisions.
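The core idea of a repair path, a backup route that shares no links with the working path between ingress and egress, can be sketched on a toy topology. The sketch below illustrates only this general concept (using networkx), not the multi-repair-path protection scheme proposed in the paper.

```python
# Minimal sketch of the general idea of a repair path: compute a working path
# from ingress to egress, then a backup path that avoids the working path's
# links, so a single link failure cannot break both. Illustrative topology.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("ingress", "A", 1), ("A", "B", 1), ("B", "egress", 1),
    ("ingress", "C", 2), ("C", "D", 2), ("D", "egress", 2),
    ("A", "D", 3),
])

working = nx.shortest_path(G, "ingress", "egress", weight="weight")

# Remove the working path's links and search again for a repair path.
H = G.copy()
H.remove_edges_from(zip(working, working[1:]))
repair = nx.shortest_path(H, "ingress", "egress", weight="weight")

print("working path:", working)   # ingress -> A -> B -> egress
print("repair path: ", repair)    # ingress -> C -> D -> egress
```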

Relevance:

100.00%

Publisher:

Abstract:

Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
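As an illustration of the kriging step discussed in the article, the sketch below performs ordinary kriging at a single location under an assumed exponential variogram. The gauge locations, rainfall values and variogram parameters are synthetic assumptions, not the Ethiopian data or the climatological variograms the article describes.

```python
# Minimal ordinary-kriging sketch at a single prediction location with an
# assumed exponential variogram. Gauge coordinates, rainfall values and
# variogram parameters are synthetic.
import numpy as np

def gamma(h, sill=1.0, range_km=30.0, nugget=0.1):
    """Exponential variogram; gamma(0) = 0, the nugget applies for h > 0."""
    return np.where(h > 0, nugget + (sill - nugget) * (1 - np.exp(-h / range_km)), 0.0)

rng = np.random.default_rng(2)
xy = rng.uniform(0, 100, (10, 2))      # gauge coordinates (km)
z = rng.gamma(2.0, 5.0, 10)            # daily rainfall at gauges (mm), synthetic
x0 = np.array([50.0, 50.0])            # prediction location

# Ordinary kriging system: [[Gamma, 1], [1^T, 0]] [w; mu] = [gamma0; 1]
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
n = len(z)
A = np.ones((n + 1, n + 1))
A[:n, :n] = gamma(d)
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))

sol = np.linalg.solve(A, b)
w, mu = sol[:n], sol[n]
z_hat = w @ z                          # kriging estimate (weights sum to 1)
krig_var = w @ b[:n] + mu              # kriging variance
print(f"estimate = {z_hat:.2f} mm, kriging variance = {krig_var:.3f}")
```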