957 results for "proposal to state"
Abstract:
The presence in Bolivia of a radical katarismo, promoted by the most recent indigenist movements, compels us to rethink criollo-mestizo politics of knowledge. In this sense, this article situates the proposal to "Indianize the q'ara" as an "upside-down pedagogy" with which the radical katarista ideology of Felipe Quispe, alias "el Mallku", questions from below the mestizaje that dominant groups have promoted since the beginning of the century.
Abstract:
The Ecuadorian Constitution promulgated on 20 October 2008 prescribes in Article 11.9 that state responsibility consists in the reparation of violations of rights caused by the absence or deficient provision of public services, or by actions or omissions in the exercise of public powers, a reparation that is integral in accordance with the guarantee-oriented principle of rights protection set out in Article 86.3. This study addresses general aspects of the legal nature of state responsibility (from state irresponsibility to state responsibility), its evolution (from the civil, indemnity-based conception to the conception of integral reparation), and the so-called legal titles of objective imputation (failure of service, disruption of the equality of public burdens), which are of central importance for understanding the new regime and whose specific aspects would merit a broader examination.
Abstract:
The relationship between speed and crashes has been well established in the literature, with the consequence that speed reduction through enforcement or other means should lead to a reduction in crashes. The extent to which the public regard speeding as a problem that requires enforcement is less clear. Analysis was conducted on public perceptions of antisocial behaviors, including speeding traffic. The data were collected as part of the British Crime Survey, a face-to-face interview with UK residents on issues relating to crime. The antisocial behavior section required participants to state the degree to which they perceived 16 antisocial behaviors to be a problem in their area. Results revealed that speeding traffic was perceived as the greatest problem in local communities, regardless of whether respondents were male or female, young, middle-aged, or old. The rating of speeding traffic as the greatest problem in the community was replicated in a second, smaller postal survey, where respondents also provided strong support for enforcement on residential roads and indicated that traveling immediately above the speed limit on residential roads was unacceptable. Results are discussed in relation to practical implications for speed enforcement and the prioritization of limited police resources.
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting these data to transform them into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modeling, empirical data modeling, knowledge discovery, data mining, and data fusion.
Abstract:
Under the Public Bodies Bill 2010, the HFEA, the cornerstone of the regulation of assisted reproduction technologies (ART) for the last twenty years, is due to be abolished. This implies that there is no longer a need for a dedicated regulator for ART and that the existing roles of the Authority, as both an operational compliance monitor and an instance of ethical evaluation, may be absorbed by existing healthcare regulators. This article presents a timely analysis of these disparate functions of the HFEA, charting the reforms adopted in 2008 and assessing the impact of the current proposals. Taking assisted conception treatment as the focus activity, it will be shown that the last few years have seen a concentration on the HFEA as a technical regulator based upon the principles of Better Regulation, with little analysis of how the ethical responsibility of the Authority fits into this framework. The current proposal to abolish the HFEA continues to fail to address this crucial question. Notwithstanding the fact that the scope of the Authority's ethical role may be questioned, its abolition requires that the Government consider what alternatives exist, or need to be put in place, to provide both responsive operational regulation and a forum for ethical reflection and decision-making in an area which continues to pose regulatory challenges.
Abstract:
Background and aims: GP-TCM is the first EU-funded Coordination Action consortium dedicated to traditional Chinese medicine (TCM) research. This paper aims to summarise the objectives, structure and activities of the consortium and introduces the position of the consortium regarding good practice, priorities, challenges and opportunities in TCM research. Serving as the introductory paper for the GP-TCM Journal of Ethnopharmacology special issue, this paper describes the roadmap of the special issue and reports how the main outputs of the ten GP-TCM work packages are integrated and have led to consortium-wide conclusions. Materials and methods: Literature studies, opinion polls and discussions among consortium members and stakeholders. Results: By January 2012, through three years of team building, the GP-TCM consortium had grown into a large collaborative network involving ~200 scientists from 24 countries and 107 institutions. Consortium members had worked closely to address good practice issues related to various aspects of Chinese herbal medicine (CHM) and acupuncture research, the focus of this Journal of Ethnopharmacology special issue, leading to state-of-the-art reports, guidelines and consensus on the application of omics technologies in TCM research. In addition, through an online survey open to GP-TCM members and non-members, we polled opinions on grand priorities, challenges and opportunities in TCM research. Based on the poll, although consortium members and non-members had diverse opinions on the major challenges in the field, both groups agreed that high-quality efficacy/effectiveness and mechanistic studies are grand priorities and that the TCM legacy in general, and its management of chronic diseases in particular, represent grand opportunities. Consortium members cast their votes of confidence in omics and systems biology approaches to TCM research and believed that quality and pharmacovigilance of TCM products are not only grand priorities but also grand challenges. Non-members, however, gave priority to integrative medicine, expressed concern about the impact of regulation of TCM practitioners and emphasised intersectoral collaborations in funding TCM research, especially clinical trials. Conclusions: The GP-TCM consortium made great efforts to address some fundamental issues in TCM research, including developing guidelines as well as identifying priorities, challenges and opportunities. These consortium guidelines and consensus will need dissemination, validation and further development through continued interregional, interdisciplinary and intersectoral collaborations. To promote this, a new consortium, known as the GP-TCM Research Association, is being established to succeed the three-year fixed-term FP7 GP-TCM consortium and will be officially launched at the Final GP-TCM Congress in Leiden, the Netherlands, in April 2012.
Abstract:
English teachers in England have experienced a lengthy period of external constraint that has increasingly controlled their practice. This constraint originated in the 1989 National Curriculum. Although its first version was in harmony with practice, its numerous revisions have moved it a long way from teachers' own values and beliefs. This move is illustrated through research into the teaching of literature, which English teachers often see as arid and driven by examinations alone. The period has been increasingly dominated by high-stakes testing, school league tables and frequent school inspections. Another powerful element has been the introduction of Standards for teachers at every career level, from student teachers to Advanced Skills Teachers. Research demonstrates that the introduction of Standards has had some beneficial effects. However, research also shows that the government's decision to replace all these hierarchically structured standards with a single standard is seen by many teachers as a retrograde step. Evidence from Advanced Skills Teachers of English shows that the government's additional proposal to bring in a Master Teacher standard is equally problematic. The decline of the National Association for the Teaching of English (NATE), the key subject association for English teachers, is discussed in relation to this increasingly negative and constraining environment, concluding that many English teachers are choosing a form of local resistance which, while understandable, weakens the credibility of the profession and erodes the influence of its key voice, NATE.
Abstract:
Transient and equilibrium sensitivity of Earth's climate has been calculated using global temperature, forcing and heating rate data for the period 1970–2010. We have assumed increased long-wave radiative forcing in the period due to the increase of the long-lived greenhouse gases. By assuming the change in aerosol forcing in the period to be zero, we calculate what we consider to be lower bounds to these sensitivities, as the magnitude of the negative aerosol forcing is unlikely to have diminished in this period. The radiation imbalance necessary to calculate equilibrium sensitivity is estimated from the rate of ocean heat accumulation as 0.37 ± 0.03 W m^−2 (all uncertainty estimates are 1σ). With these data, we obtain best estimates for the transient climate sensitivity of 0.39 ± 0.07 K (W m^−2)^−1 and the equilibrium climate sensitivity of 0.54 ± 0.14 K (W m^−2)^−1, equivalent to 1.5 ± 0.3 and 2.0 ± 0.5 K (3.7 W m^−2)^−1, respectively. The latter quantity is equal to the lower bound of the 'likely' range for this quantity given by the 2007 IPCC Assessment Report. The uncertainty attached to the lower-bound equilibrium sensitivity permits us to state, within the assumptions of this analysis, that the equilibrium sensitivity is greater than 0.31 K (W m^−2)^−1, equivalent to 1.16 K (3.7 W m^−2)^−1, at the 95% confidence level.
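The quoted values are consistent with the usual observational energy-balance bookkeeping; in our notation (the abstract itself states no formulas), with ΔT the warming over 1970–2010, ΔF the greenhouse-gas forcing change and Q ≈ 0.37 W m^−2 the radiation imbalance inferred from ocean heat uptake:

    S_\mathrm{tr} \approx \frac{\Delta T}{\Delta F}, \qquad
    S_\mathrm{eq} \approx \frac{\Delta T}{\Delta F - Q}

Multiplying either sensitivity by the 3.7 W m^−2 forcing of a CO2 doubling recovers, to rounding, the per-doubling figures of about 1.5 and 2.0 K quoted above.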
Abstract:
This work investigates a type of wireless power system whose analysis yields the construction of a prototype modeled as a single technological artifact. Exploration of this artifact forms the intellectual basis not only for its prototypical forms but also suggests variant forms not yet discovered. The process greatly clarifies the role of the artifact, its most suitable application given the constraints of the delivery problem, and optimization strategies to improve it. To improve maturity and contribute to a body of knowledge, this document proposes research on efficient inductive transfer in the mid-field region for the purpose of removing wired connections and electrical contacts. While this description suffices to state the purpose of the work, it does not convey the compromises involved in redrawing the lines of demarcation between the near and far field in the traditional method of broadcasting. Two striking scenarios are addressed in this thesis: first, the mathematical explanation of wireless power follows from J.C. Maxwell's original equations; second, the behavior of wireless power in the circuit follows from Joseph Larmor's fundamental work on the dynamics of the field concept. A model of propagation will be presented which matches observations in experiments. A modified model of the dipole will be presented to address the phenomena observed in the theory and experiments. Two distinct sets of experiments will test the concept of single and two coupled modes. In the more esoteric context of the zero- and first-order magnetic field, a third coupled mode is suggested. Through the remaking of wireless power in this context, the author intends to show the reader that ideas lost to history, bound to a path of complete obscurity, are once again innovative and useful.
An LDA and probability-based classifier for the diagnosis of Alzheimer's Disease from structural MRI
Abstract:
In this paper a custom classification algorithm based on linear discriminant analysis and probability-based weights is implemented and applied to hippocampus measurements from structural magnetic resonance images of healthy subjects and Alzheimer's Disease sufferers, with the aim of diagnosing them as accurately as possible. The classifier works by classifying each measurement of a hippocampal volume as healthy-control-sized or Alzheimer's Disease-sized; these new features are then weighted and used to classify the subject as a healthy control or as suffering from Alzheimer's Disease. The preliminary results obtained reach an accuracy of 85.8%, which is similar to state-of-the-art methods such as a Naive Bayes classifier and a Support Vector Machine. An advantage of the method proposed in this paper over the aforementioned state-of-the-art classifiers is the descriptive ability of the classifications it produces. The descriptive model can be of great help to a doctor in the diagnosis of Alzheimer's Disease, or even further the understanding of how Alzheimer's Disease affects the hippocampus.
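A minimal sketch of a two-stage classifier in this spirit is given below (an illustration only, not the authors' implementation; the function names, the use of a per-measurement scikit-learn LinearDiscriminantAnalysis, and the choice of weighting each measurement by its training accuracy are our assumptions):

    # Illustrative two-stage classifier: label each hippocampal measurement as
    # AD-sized or control-sized with a per-measurement LDA, then combine the
    # labels with probability-based weights (here each LDA's training accuracy,
    # a hypothetical choice) to classify the subject.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def fit_measurement_models(X, y):
        """X: (subjects x measurements) array, y: 0 = control, 1 = AD."""
        models, weights = [], []
        for j in range(X.shape[1]):
            lda = LinearDiscriminantAnalysis().fit(X[:, [j]], y)
            models.append(lda)
            weights.append(lda.score(X[:, [j]], y))   # weight = training accuracy
        return models, np.asarray(weights)

    def classify_subject(models, weights, x):
        """Weighted vote over the per-measurement labels; returns (class, score)."""
        votes = np.array([m.predict([[x[j]]])[0] for j, m in enumerate(models)])
        score = float(np.dot(weights, votes) / weights.sum())
        return int(score >= 0.5), score

In a sketch like this, the weighted per-measurement votes also give the kind of descriptive output the abstract highlights, since they show which hippocampal measurements drove the final decision.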
Abstract:
This paper proposes a filter-based algorithm for feature selection. The filter is based on the partitioning of the set of features into clusters. The number of clusters, and consequently the cardinality of the subset of selected features, is automatically estimated from the data. The computational complexity of the proposed algorithm is also investigated. A variant of this filter that considers feature-class correlations is also proposed for classification problems. Empirical results involving ten datasets illustrate the performance of the developed algorithm, which in general has obtained competitive results in terms of classification accuracy when compared to state-of-the-art algorithms that find clusters of features. We show that, if computational efficiency is an important issue, the proposed filter may be preferred over its counterparts, making it eligible to join a pool of feature selection algorithms to be used in practice. As an additional contribution of this work, a theoretical framework is used to formally analyze some properties of feature selection methods that rely on finding clusters of features.
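The overall structure of such a cluster-based filter can be sketched as follows (illustrative only: the hierarchical clustering of the feature-correlation matrix, the dist_threshold stand-in for the paper's automatic estimation of the number of clusters, and the representative-selection rules are assumptions, not the published algorithm):

    # Sketch of a cluster-based feature-selection filter: group correlated
    # features, then keep one representative per cluster.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def cluster_filter(X, y=None, dist_threshold=0.5):
        """Select one feature per correlation cluster of the columns of X.

        dist_threshold is a hypothetical stand-in for automatic estimation of
        the number of clusters; passing y enables the supervised variant that
        uses feature-class correlation to pick each cluster's representative.
        """
        corr = np.abs(np.corrcoef(X, rowvar=False))          # |feature-feature correlation|
        dist = 1.0 - corr[np.triu_indices_from(corr, k=1)]   # condensed distance vector
        labels = fcluster(linkage(dist, method="average"),
                          t=dist_threshold, criterion="distance")
        selected = []
        for c in np.unique(labels):
            members = np.flatnonzero(labels == c)
            if y is None:
                selected.append(members[0])                  # unsupervised: arbitrary pick
            else:
                rel = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in members]
                selected.append(members[int(np.argmax(rel))])  # most class-correlated member
        return sorted(selected)

The appeal of this family of filters, as the abstract notes, is computational: clustering the feature-correlation structure once replaces repeated evaluation of candidate feature subsets.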
Abstract:
For the first time, crystals of suitable size for structure determination by X-ray diffractometry of didanosine (ddI), an important anti-HIV drug, were prepared under solvothermal conditions. In this study, the crystal structure of didanosine (2′,3′-dideoxyinosine, ddI) in the form of a hydrate was determined using single-crystal X-ray diffractometry. Powder X-ray diffraction analysis revealed that the solid-state phase of the drug incorporated into pharmaceutical solid dosage forms is isostructural to the solvothermally prepared ddI material, even though they do not exhibit an identical chemical composition due to different water fractions occupying hydrophobic channels formed within the crystal lattice. Two ddI conformers are present in the structure, in agreement with a previous structure elucidation attempt. Concerning the keto-enol equilibrium of ddI, our crystal data and vibrational characterizations by Fourier transform infrared (FTIR) and FT-Raman spectroscopy were conclusive in establishing that both conformers exist in the keto form, contrary to solid-state NMR spectroscopic assignments that suggested ddI molecules occur as enol tautomers. In addition, characterizations by thermal (differential scanning calorimetry) and spectroscopic techniques allowed us to understand the structural similarities and the differences related to the hydration pattern of the nonstoichiometric hydrates.
Abstract:
Consider a continuous-time Markov process with transition rate matrix Q on the state space Λ ∪ {0}. In the associated Fleming-Viot process, N particles evolve independently in Λ with transition rate matrix Q until one of them attempts to jump to state 0. At this moment the particle instead jumps to the position of one of the other particles, chosen uniformly at random. When Λ is finite, we show that the empirical distribution of the particles at a fixed time converges, as N → ∞, to the distribution of a single particle at the same time conditioned on not touching {0}. Furthermore, the empirical profile of the unique invariant measure of the Fleming-Viot process with N particles converges, as N → ∞, to the unique quasi-stationary distribution of the one-particle motion. A key element of the approach is to show that the two-particle correlations are of order 1/N.
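The particle dynamics described above are straightforward to simulate; the sketch below (illustrative only; the rate matrix Q, particle number N and time horizon T are supplied by the user, and index 0 plays the role of the absorbing state) makes the relocation rule explicit:

    # Minimal simulation sketch of the Fleming-Viot dynamics described above.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_fv(Q, N, T):
        """Simulate N Fleming-Viot particles on {1,...,K} up to time T.

        Q is the (K+1)x(K+1) rate matrix on {0,1,...,K}, with 0 absorbing.
        A particle that tries to jump to 0 is relocated to the position of a
        uniformly chosen other particle instead.
        """
        K = Q.shape[0] - 1
        pos = rng.integers(1, K + 1, size=N)        # initial positions in {1,...,K}
        t = 0.0
        while True:
            rates = -Q[pos, pos]                    # exit rate of each particle
            total = rates.sum()
            t += rng.exponential(1.0 / total)       # time of the next jump
            if t > T:
                return pos
            i = rng.choice(N, p=rates / total)      # particle that jumps
            probs = Q[pos[i]].copy()
            probs[pos[i]] = 0.0                     # jump distribution from current state
            target = rng.choice(K + 1, p=probs / probs.sum())
            if target == 0:                         # attempted absorption:
                j = rng.choice([k for k in range(N) if k != i])
                pos[i] = pos[j]                     # jump onto another particle
            else:
                pos[i] = target

    # Example with Λ = {1, 2} and absorption only from state 1 (hypothetical rates):
    # Q = np.array([[0.0,  0.0,  0.0],
    #               [0.5, -1.5,  1.0],
    #               [0.0,  1.0, -1.0]])
    # empirical = np.bincount(simulate_fv(Q, N=200, T=10.0), minlength=3)[1:] / 200

For large N, the empirical distribution returned by such a simulation approximates the conditioned (and, for large T, the quasi-stationary) distribution described in the abstract.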
Abstract:
Over the last decade the problem of surface inspection has received great attention from the scientific community, since quality control and the maintenance of products are key points in several industrial applications. Railway associations spend a great deal of money checking the railway infrastructure, a particular field in which periodic surface inspection can help the operator prevent critical situations; the maintenance and monitoring of this infrastructure is therefore an important concern for railway associations. Surface inspection also matters to the railroad authority in order to investigate track components, identify problems and find out how to solve them. In the railway industry, problems are usually found in sleepers, overhead lines, fasteners, rail heads, switches and crossings, and in the ballast section. In this thesis work, I have reviewed research papers based on AI techniques combined with NDT techniques, which are able to collect data from the test object without causing any damage. The reviewed works demonstrate that, by adopting AI-based systems, it is possible to solve almost all of these problems, and that such systems are very reliable and efficient for diagnosing problems in this transportation domain. I have also reviewed solutions based on AI techniques provided by different companies, their products, and some white papers published by those companies. AI-based techniques such as machine vision, stereo vision, laser-based techniques and neural networks are used in most cases to solve problems otherwise handled by railway engineers. The problems addressed by these AI-based techniques are approached through NDT, a very broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion. The NDT approach ensures the uniformity, quality and serviceability of materials without causing damage to the material being tested. The testing methods include visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromechanical testing and acoustic emission testing. The inspection procedure is carried out periodically for the sake of better maintenance, and is performed manually by railway engineers with the aid of AI-based techniques. The main idea of this thesis is to demonstrate how the problems in this transportation area can be reduced, based on the work done by different researchers and companies; I also provide some ideas and comments on that work and propose better inspection methods where they are needed. The scope of this thesis is the automatic interpretation of data from NDT, with the goal of detecting flaws accurately and efficiently. AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide an insight into possible research methods concerning railway sleeper, fastener, ballast and overhead inspection by automatic interpretation of data. In this thesis I discuss problems that arise in railway sleepers, fasteners, overhead lines and ballasted track.
For this reason I have reviewed research papers related to these areas and demonstrated how the proposed systems work and what results they achieve, highlighting the advantages of using AI techniques compared with the manual systems that existed previously. This work thus summarizes the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from non-destructive testing (NDT). Problems in the rail transport domain are the main focus, and the work as a whole addresses the inspection of railway sleepers, fasteners, ballast and overhead lines.
Abstract:
The task of the HR function in an organization is to make the most of the human resource and to ensure that good working conditions are achieved, in order to be able to attract, recruit, retain and develop competence (Kira 2003). In recent years, the working conditions and terms of employment of, among others, unit managers in elderly care have drawn attention in Swedish municipalities. The working conditions, the terms of employment and, above all, the high staff turnover among these managers have led to problems that constituted one of the most important issues of the 2014 election. The purpose of this study is to describe the working conditions of unit managers within the social sector in smaller Swedish municipalities and to highlight how these could be improved. Orsa municipality was used as the object of study. The results of the investigation showed that the work situation of the unit managers is overwhelming, owing to a high workload and poor structure in the work. The unit managers themselves would benefit from an assistant and a job description, in order to reduce the workload and bring structure to the work. Our conclusion is that Swedish municipalities should work on creating better working conditions for unit managers and on easing their workload. Our proposal to Orsa municipality is to make use of our action plan and thereby hire assistants for the unit managers and draw up job descriptions. Further research on the subject could shed light on the significance of mentorship in the public sector within Swedish municipalities, as well as on the importance of organizational structure for the attractiveness of the work.