912 results for Compositional Rule of Inference
Abstract:
Chloride extraction from iron artifacts was studied by electrochemical methods. The effect of the current and potential values on the desalination of simulated iron artifacts was studied through galvanostatic and potentiostatic experiments; the composition of the rust before and after treatment was also analyzed by X-ray diffraction (XRD). It was found that the optimal current density was between -0.50 and -0.75 mA/cm² and the optimal potential was between -1.175 and -1.200 V. The phases in the samples' rust transformed after treatment, and the anti-corrosion performance improved.
Abstract:
In the principles-and-parameters model of language, the principle known as "free indexation" plays an important part in determining the referential properties of elements such as anaphors and pronominals. This paper addresses two issues. (1) We investigate the combinatorics of free indexation. In particular, we show that free indexation must produce an exponential number of referentially distinct structures. (2) We introduce a compositional free indexation algorithm. We prove that the algorithm is "optimal." More precisely, by relating the compositional structure of the formulation to the combinatorial analysis, we show that the algorithm enumerates precisely all possible indexings, without duplicates.
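As a purely illustrative sketch (not the authors' algorithm), an indexing can be modelled as a partition of noun phrases into coreference classes; the following enumerates all distinct indexings without duplicates, and the counts (the Bell numbers) reflect the exponential growth the paper proves. All names here are hypothetical.

```python
def indexings(items):
    """Enumerate all distinct indexings of `items` as partitions into
    coreference classes, without duplicates (indices matter only up to
    renaming, so each partition corresponds to exactly one indexing)."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for partial in indexings(rest):
        # co-index `first` with each existing coreference class...
        for i in range(len(partial)):
            yield partial[:i] + [[first] + partial[i]] + partial[i + 1:]
        # ...or give it a fresh index of its own
        yield [[first]] + partial

counts = [sum(1 for _ in indexings(list(range(n)))) for n in range(1, 6)]
# counts are the Bell numbers 1, 2, 5, 15, 52 — exponential growth
```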
Abstract:
Gibbs, N., Getting Constitutional Theory into Proportion: A Matter of Interpretation?, Oxford Journal of Legal Studies, 27 (1), 175-191. RAE2008
Abstract:
This thesis interrogates the construction of fairness to the accused in historic child sexual abuse trials in Ireland. The protection of fairness is a requirement of any trial that claims to adhere to the rule of law. Historic child sexual abuse trials, in which the charges relate to events that are alleged to have taken place decades previously, present serious challenges to the ability of the trial process to safeguard fairness. They are a litmus test of the courts’ commitment to fairness. The thesis finds that in historic abuse trials fairness to the accused has been significantly eroded and that therefore the Irish courts have failed to respect the core of the rule of law in these most serious of prosecutions. The thesis scrutinises two bodies of case law, both of which deal with the issue of whether evidence should reach the jury. First, it examines the decisions on applications brought by defendants seeking to prohibit their trial. The courts hearing prohibition applications face a dilemma: how to ensure the defendant is not put at risk of an unfair trial, while at the same time recognising that delay in reporting is a defining feature of these cases. The thesis traces the development of the prohibition case law and tracks the shifting interpretations given to fairness by the courts. Second, the thesis examines what fairness means in the superior courts’ decisions regarding the admissibility of the following kinds of evidence, each of which presents particular challenges to the ability of the trial to safeguard fairness: evidence of multiple complainants; evidence of recovered memories; and evidence of complainants’ therapeutic records. The thesis finds that in both bodies of case law the Irish courts have hollowed out the meaning of fairness. It makes proposals on how fairness might be placed at the heart of courts’ decisions on admissibility in historic abuse trials.
The thesis concludes that the erosion of fairness in historic abuse trials is indicative of a move away from the liberal model of criminal justice. It cautions that unless fairness is prioritised in historic child sexual abuse trials the legitimacy of these trials and that of all Irish criminal trials will be contestable.
Abstract:
The main objective of this thesis is a critical analysis of the evolution of criminal justice systems over the past decade, with special attention to the fight against transnational terrorism. It is evident to any observer that the threat and the associated risk that terrorism entails have changed significantly over that period. This perception has generated responses, often radical ones, from States as they have committed themselves to guaranteeing the safety of their populations and to easing a growing sense of social panic. This thesis seeks to analyse the characteristics of this new threat and the responses that States have developed in the fight against terrorism since 9/11, which have called into question some of the essential principles and values of their own legal systems. In this sense, freedom and security are placed in perspective through the analysis of the specific anti-terrorist legal reforms of five States: Israel, Portugal, Spain, the United Kingdom and the United States of America. In light of those anti-terrorist reforms, it is then asked whether it is possible to speak of the emergence of a new system of criminal justice (and of a process of convergence between common law and civil law systems), built upon a framework of control and preventive security significantly different from traditional models. Finally, this research project has the fundamental objective of contributing to a better understanding of the economic, social and civilizational costs of those legal reforms for human rights, the rule of law and democracy in modern States.
Abstract:
Electron microscopy (EM) has advanced exponentially since the first transmission electron microscope (TEM) was built in the 1930s. The urge to ‘see’ things is an essential part of human nature (‘seeing is believing’), and apart from scanning tunnelling microscopes, which give information only about the surface, EM is the only imaging technology capable of truly visualising atomic structures in depth, down to single atoms. With the development of nanotechnology the demand to image and analyse small things has become even greater, and electron microscopes have made their way from highly delicate and sophisticated research-grade instruments to turnkey and even bench-top instruments for everyday use in every materials research lab on the planet. The semiconductor industry is as dependent on the use of EM as the life sciences and pharmaceutical industry. With this generalisation of use for imaging, the need to deploy advanced uses of EM has become more and more apparent. The combination of several coinciding beams (electron, ion and even light) to create DualBeam or TripleBeam instruments, for instance, extends their usefulness from pure imaging to manipulation on the nanoscale. And given the many ways the highly energetic electrons and ions interact with the matter in the specimen, a plethora of analytical niches has evolved over the last two decades, specialising in every kind of analysis that can be combined with EM. In the course of this study the emphasis was placed on the application of these advanced analytical EM techniques in the context of multiscale and multimodal microscopy: multiscale meaning across length scales from micrometres or larger down to nanometres, multimodal meaning numerous techniques applied to the same sample volume in a correlative manner.
To demonstrate the breadth and potential of the multiscale and multimodal concept, its integration was attempted in two areas: I) biocompatible materials, using polycrystalline stainless steel, and II) semiconductors, using thin multiferroic films. I) The motivation to use stainless steel (316L medical grade) comes from the potential modulation of endothelial cell growth, which can have a big impact on the improvement of cardiovascular stents (mainly made of 316L) through nano-texturing of the stent surface by focused ion beam (FIB) lithography. Patterning with FIB has never before been reported in connection with stents and cell growth, and in order to gain a better understanding of the beam–substrate interaction during patterning, a correlative microscopy approach was used to illuminate the patterning process from as many angles as possible. Electron backscatter diffraction (EBSD) was used to analyse the crystallographic structure; FIB was used for the patterning and for simultaneously visualising the crystal structure as part of the monitoring process; scanning electron microscopy (SEM) and atomic force microscopy (AFM) were employed to analyse the topography; and the final step was 3D visualisation through serial FIB/SEM sectioning. II) The motivation for the use of thin multiferroic films stems from the ever-growing demand for increased data storage at ever lower energy consumption. The Aurivillius phase material used in this study has high potential in this area. Yet it is necessary to show clearly that the film is really multiferroic and that no second-phase inclusions are present even at very low concentrations; ~0.1 vol% could already be problematic. Thus, in this study a technique was developed to analyse ultra-low-density inclusions in thin multiferroic films down to concentrations of 0.01%.
The goal achieved was a complete structural and compositional analysis of the films, which required identifying second-phase inclusions (through energy-dispersive X-ray (EDX) elemental analysis), localising them (employing 72-hour EDX mapping in the SEM), isolating them for the TEM (using FIB) and placing an upper confidence limit of 99.5% on the influence of the inclusions on the magnetic behaviour of the main phase (statistical analysis).
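The statistical analysis behind the 99.5% confidence limit is not specified in the abstract. As one illustrative possibility only, assuming inclusion counts in the sampled volume follow a Poisson model, an exact one-sided upper limit on the expected count can be computed as follows (hypothetical helper, standard library only):

```python
import math

def poisson_upper_limit(k, alpha=0.005):
    """One-sided upper confidence limit for a Poisson mean given k observed
    events: the smallest lam with P(X <= k | lam) <= alpha, found here by
    bisection on the (monotonically decreasing) Poisson CDF."""
    def cdf(lam):
        return sum(math.exp(-lam) * lam**i / math.factorial(i)
                   for i in range(k + 1))
    lo, hi = 0.0, 10.0 * (k + 1) + 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if cdf(mid) > alpha:
            lo = mid   # lam too small: data still too likely
        else:
            hi = mid
    return hi

# zero inclusions observed -> 99.5% upper limit on the expected count
# is -ln(0.005), about 5.30 events in the sampled volume
limit = poisson_upper_limit(0)
```

Dividing such a limit by the sampled volume would give an upper bound on inclusion concentration; the actual analysis in the thesis may differ.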
Abstract:
The requirement for a very accurate dependence analysis to underpin software tools that aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper summarises the limitations of conventional dependence analysis techniques and then describes a series of extensions which enable the production of a much more accurate dependence graph. The extensions include analysis of symbolic variables; the development of a symbolic inequality disproof algorithm and its exploitation in a symbolic Banerjee inequality test; the use of inference engine proofs; the exploitation of exact dependence and dependence pre-domination attributes; interprocedural array analysis; conditional variable definition tracing; and integer array tracing and division calculations. Analysis case studies on typical numerical code are shown to reduce the total dependencies estimated by conventional analysis by up to 50%. The techniques described in this paper have been embedded within a suite of tools, CAPTools, which combines analysis with user knowledge to produce efficient parallel implementations of numerical mesh-based codes.
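The symbolic Banerjee machinery described above is far richer than can be shown here, but the flavour of such dependence tests can be illustrated with the classical GCD test, a simpler, well-known relative. This is a generic sketch, not CAPTools code:

```python
from math import gcd

def gcd_test(a, b, c, d):
    """Classical GCD dependence test for array references A[a*i + b] and
    A[c*j + d] inside a loop: an integer solution of a*i - c*j = d - b
    exists only if gcd(a, c) divides d - b. Returns False when dependence
    is provably impossible, True when it cannot be ruled out (conservative,
    since loop bounds are ignored -- that is what Banerjee-style bounds add)."""
    return (d - b) % gcd(a, c) == 0

# A[2*i] vs A[2*j + 1]: gcd(2, 2) = 2 does not divide 1 -> no dependence
no_dep = gcd_test(2, 0, 2, 1)
# A[2*i] vs A[4*j + 2]: gcd(2, 4) = 2 divides 2 -> dependence possible
maybe_dep = gcd_test(2, 0, 4, 2)
```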
Abstract:
The decisions animals make about how long to wait between activities can determine the success of diverse behaviours such as foraging, group formation or risk avoidance. Remarkably, for diverse animal species, including humans, spontaneous patterns of waiting times show random ‘burstiness’ that appears scale-invariant across a broad set of scales. However, a general theory linking this phenomenon across the animal kingdom currently lacks an ecological basis. Here, we demonstrate from tracking the activities of 15 sympatric predator species (cephalopods, sharks, skates and teleosts) under natural and controlled conditions that bursty waiting times are an intrinsic spontaneous behaviour well approximated by heavy-tailed (power-law) models over data ranges up to four orders of magnitude. Scaling exponents quantifying ratios of frequent short to rare very long waits are species-specific, being determined by traits such as foraging mode (active versus ambush predation), body size and prey preference. A stochastic–deterministic decision model reproduced the empirical waiting time scaling and species-specific exponents, indicating that apparently complex scaling can emerge from simple decisions. Results indicate temporal power-law scaling is a behavioural ‘rule of thumb’ that is tuned to species’ ecological traits, implying a common pattern may have naturally evolved that optimizes move–wait decisions in less predictable natural environments.
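As a generic illustration of the heavy-tailed waiting times described above (not the authors' stochastic–deterministic decision model), power-law distributed waits with a chosen scaling exponent can be drawn by inverse-transform sampling from a Pareto distribution; all names here are hypothetical:

```python
import random

def pareto_wait(alpha, t_min=1.0):
    """Draw a waiting time from a power-law (Pareto) distribution with
    survival function P(T > t) = (t / t_min) ** (-alpha), t >= t_min,
    via inverse-transform sampling."""
    u = random.random()
    return t_min * (1.0 - u) ** (-1.0 / alpha)

random.seed(1)
waits = [pareto_wait(1.5) for _ in range(100_000)]
# frequent short waits punctuated by rare very long ones: the empirical
# survival function on log-log axes is roughly a line of slope -alpha
```

A smaller exponent produces heavier tails, i.e. relatively more of the rare very long waits that distinguish ambush from active foragers in the abstract's account.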
Abstract:
In many domains, when several competing classifiers are available, we want to synthesize them (or some of them) into a more accurate classifier via a combination function. In this paper we propose a ‘class-indifferent’ method for combining classifier decisions represented by evidential structures called triplet and quartet, using Dempster's rule of combination. This method is unique in that it distinguishes important elements from trivial ones in representing classifier decisions, makes use of more information than others in calculating the support for class labels, and provides a practical way to apply the theoretically appealing Dempster–Shafer theory of evidence to the problem of ensemble learning. We present a formalism for modelling classifier decisions as triplet mass functions and establish a range of formulae for combining these mass functions in order to arrive at a consensus decision. In addition, we carry out a comparative study with the alternative simplet and dichotomous structures, and compare two combination methods, Dempster's rule and majority voting, over the UCI benchmark data, to demonstrate the advantage our approach offers. (A continuation of work in this area published in IEEE Transactions on Knowledge and Data Engineering and at conferences.)
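Dempster's rule at the heart of such methods can be sketched generically. This is the standard rule over an arbitrary frame of discernment, not the paper's triplet/quartet representation; the example mass functions are hypothetical:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets: conjunctive combination of all focal-element
    pairs, then normalisation by 1 - K, where K is the conflict mass that
    fell on the empty intersection."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# two classifiers expressing beliefs over labels {'a', 'b', 'c'}
m1 = {frozenset('a'): 0.6, frozenset('abc'): 0.4}
m2 = {frozenset('ab'): 0.7, frozenset('abc'): 0.3}
m = dempster_combine(m1, m2)
# mass concentrates on 'a': m[{'a'}] = 0.6, m[{'a','b'}] = 0.28, m[frame] = 0.12
```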
Abstract:
Purpose – This paper aims to examine the growing incidence of judicialisation of politics in Nigeria’s democratisation experience against the backdrop of questionable judicial accountability. Design/methodology/approach – The article draws on legal and political theory as well as comparative law perspectives. Findings – The judiciary faces a daunting task in deepening democracy and (re)instituting the rule of law. The formidable challenges derive in part from structural problems within the judiciary, deficient accountability credentials and the complexities of a troubled transition. Practical implications – Effective judicial mediation of political transition requires a transformed and accountable judiciary. Originality/value – The article calls attention to the need for judicial accountability as a cardinal and integral part of political transitions. Keywords: Democracy, Politics, Law, Nigeria, Africa. Paper type: Viewpoint
Abstract:
This account of judicialised politics in the Nigerian transition experience examines the judiciary's regulation of the political space through the resolution of intergovernmental contestations in a dysfunctional federation. It analyses the judicialisation of elite power disputes, which has resonance for due process and the rule of law in particular and for governance in general. A study of the role of the judiciary in stabilising the country, itself a pivot of the West African region in particular and of Africa in general, is important, especially in view of its classification as a ‘weak state’ despite its enormous human and natural resources. The analyses here suggest the Supreme Court has taken a strategic position in the task of democratic institution building and the reinstitution of the rule of law in the country. This strategic stance has received public acclaim. However, the account also discloses that the judiciary, in the course of its numerous interventions, has been drawn into overly political disputes that overreach its jurisprudential preferences. More significantly, it demonstrates that the judiciary is itself still challenged by institutional dysfunctions that are part of the legacies of the authoritarian era. This leads back to the need for closer scrutiny of the judicial function in transitional societies.
Abstract:
Following its transition to democracy from an authoritarian military rule marked by gross violations of human rights, Nigeria established the Human Rights Violations Investigations Commission (HRVIC) in 1999. This paper critically examines the contributions of the HRVIC, popularly known as the ‘Oputa Panel,’ to the field of transitional justice and the rule of law. It sets out the process of establishing the Commission, its mandate and how this mandate was interpreted during the course of the Commission’s work. The challenges faced by the Oputa Panel, particularly those that relate to its legal status and relationship with the judiciary, are analyzed in an attempt to draw useful guidelines from these challenges for other truth commissions. Recourse by powerful individuals to the judicial process in a bid to shield themselves from the HRVIC merits particular review as it raises questions regarding the transformation of the judiciary and the rule of law in the wake of an authoritarian regime.
Abstract:
Using Northern Ireland as a case study, this paper explores how lawyers responded to the challenges of entrenched discrimination, sustained political violence and an emerging peace process. Drawing upon the literature of the sociology of lawyering, it examines whether lawyers can or should be more than ‘paid technicians’ in such circumstances. It focuses in particular upon a number of ‘critical junctures’ in the legal history of the jurisdiction and uncouples key elements of the local legal culture which contributed to an ethos of quietism. The paper argues that the version of legal professionalism that emerged in Northern Ireland was contingent and socially constructed and, with notable exceptions, obfuscated a collective failure of moral courage. It concludes that facing the truth concerning past silence is fundamental to a properly embedded rule of law and a more grounded notion of what it means to be a lawyer in a conflict.
Abstract:
This is a survey of the applicable international human rights standards concerning the right which alleged terrorists have to access a lawyer.
Abstract:
The Irish and UK governments, along with other countries, have made a commitment to limit the concentrations of greenhouse gases in the atmosphere by reducing emissions from the burning of fossil fuels. This can be achieved (in part) through increasing the sequestration of CO2 from the atmosphere, which requires monitoring the amount stored in vegetation and soils. A large proportion of soil carbon is held within peat, owing to the relatively high carbon density of peat and organic-rich soils. This is particularly important for a country such as Ireland, where some 16% of the land surface is covered by peat. For Northern Ireland, it has been estimated that the total amount of carbon stored in vegetation is 4.4 Mt, compared to 386 Mt stored within peat and soils. As a result, it has become increasingly important to measure and monitor changes in stores of carbon in soils. The conservation and restoration of peat-covered areas, although ongoing for many years, has become increasingly important. This is reflected in current EU policy outlined by the European Commission (2012), which seeks to assess the relative contributions of the different inputs and outputs of organic carbon and organic matter to and from soil. Results are presented from the EU-funded Tellus Border Soil Carbon Project (2011 to 2013), which aimed to improve current estimates of carbon in soil and peat across Northern Ireland and the bordering counties of the Republic of Ireland.
Historical reports and previous surveys provide baseline data. To monitor change in peat depth and soil organic carbon, these historical data are integrated with more recently acquired airborne geophysical (radiometric) data and ground-based geochemical data generated by two surveys: the Tellus Project (2004-2007), covering Northern Ireland, and the EU-funded Tellus Border project (2011-2013), covering the six bordering counties of the Republic of Ireland (Donegal, Sligo, Leitrim, Cavan, Monaghan and Louth). The concept being applied is that saturated organic-rich soil and peat attenuate gamma radiation from underlying soils and rocks. This research uses the degree of spatial correlation (coregionalization) between peat depth, soil organic carbon (SOC) and the attenuation of the radiometric signal to update a limited regime of ground-based sampling with remotely acquired data. To respect the compositional nature of the SOC data (perturbations of loss-on-ignition [LOI] data), a compositional data analysis approach is investigated. Contemporaneous ground-based measurements allow corroboration of the updated mapped outputs. This provides a methodology that can improve estimates of soil carbon with minimal impact on sensitive habitats (like peat bogs) but with maximum output of data and knowledge.
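The compositional data analysis approach is not detailed in the abstract; a common starting point for closed compositions such as LOI-derived fractions (an assumption here, not necessarily the project's exact method) is the centred log-ratio (clr) transform, which moves the data off the simplex before standard geostatistical tools are applied:

```python
import math

def clr(composition):
    """Centred log-ratio transform of a composition (strictly positive
    parts summing to a constant): x -> log(x / geometric_mean(x)).
    Zero components need prior treatment (e.g. imputation) before clr."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

# hypothetical soil sample as (organic, mineral, water) fractions summing to 1
z = clr([0.45, 0.35, 0.20])
# clr coordinates sum to zero by construction, so correlations between
# them are interpretable in a way raw closed percentages are not
```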