21 results for Remedy

in Aston University Research Archive


Relevance: 20.00%

Abstract:

The article presents a rationale for communicative, conceptual, cognitive and procedural challenges experienced by litigants in person in financial remedy proceedings. The article also explores oscillation between written and spoken legal genres and narrative development strategies which litigants in person have to use throughout different stages (from the early stages of starting proceedings, filling in court forms and providing documentation, through the negotiation process to interaction in court). While legal professionals express themselves in paradigmatic legal mode influenced by legal acts and legislation, litigants in person tend to express themselves in narrative mode similar to everyday storytelling. The objective is to investigate obstacles litigants in person experience during the process originally designed by legal professionals for legal professionals. The article evaluates different options for empowering lay people involved in legal proceedings and argues for the need to provide more specific support for different stages of family proceedings.

Relevance: 10.00%

Abstract:

UK schools and universities are trying to remedy a nationally recognized skills shortage in quantitative methods among graduates. This article describes and analyses a research project in German Studies funded by the Economic and Social Research Council (ESRC) aimed at addressing the issue. The interdisciplinary pilot project introduced quantitative methods into undergraduate curricula not only in Linguistics, but also in German Studies.

Relevance: 10.00%

Abstract:

Comments on the refusal of the English courts to recognise the existence of a remedy of partial rescission, suggesting that in certain restricted instances justification exists for the grant of such a remedy. Considers the nature of the remedy of rescission under English law, the English courts' approach towards partial rescission and the nature and scope of the discretions available to the courts, noting the decisions in TSB Bank Plc v Camfield and De Molestina v Ponton. Reviews the historical origins of the remedy of rescission, including the distinction between fraudulent and non-fraudulent misrepresentation and the origins of the so-called concurrent and auxiliary equitable jurisdictions. Compares the approach of the Australian courts and highlights examples of recognition of partial rescission under international law.

Relevance: 10.00%

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context where observations are collected and reported by a network of sensors, and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. 
We discuss the issue of whether to treat or ignore extreme values, making the distinction between robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
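The contrast between a standard Gaussian likelihood and a robust alternative such as the Huber function can be illustrated with a minimal sketch. This is not the authors' projected-process kriging algorithm; the function names and the conventional threshold `c = 1.345` are illustrative assumptions. The point is that the Gaussian term grows quadratically with the residual, so one malfunctioning sensor dominates the fit, whereas the Huber term grows only linearly beyond the threshold:

```python
import numpy as np

def gaussian_nll(r):
    # Gaussian negative log-likelihood term (up to constants): quadratic in
    # the residual, so a single extreme sensor reading dominates the fit.
    return 0.5 * np.asarray(r) ** 2

def huber_nll(r, c=1.345):
    # Huber loss: quadratic for small residuals, linear beyond the
    # threshold c, which bounds the influence of outlying observations.
    r = np.abs(np.asarray(r))
    return np.where(r <= c, 0.5 * r**2, c * r - 0.5 * c**2)

# 8.0 mimics a sporadically malfunctioning sensor among routine readings.
residuals = np.array([0.1, -0.5, 1.0, 8.0])
print(gaussian_nll(residuals))  # the outlier contributes 0.5 * 8**2 = 32
print(huber_nll(residuals))     # its contribution grows only linearly
```

Under a Gaussian likelihood the outlier's term is 32.0; under the Huber loss it is about 9.86, so the covariance estimate is far less destabilised.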

Relevance: 10.00%

Abstract:

Describes the impact of the English Landlord and Tenant (Covenants) Act 1995, reforming liability in the context of new leases, extending the 'touching and concerning' requirement so that all covenants 'run with the land' (with some exceptions), and abolishing the enduring liability of the original tenants and landlords. Explains that landlords will have more freedom to prescribe in advance the circumstances in which they consent to an assignment, referring also to changes in default notices requiring an 'early warning' to defaulters, and to overriding leases, with a remedy for former tenants. Expects future leases to be shorter as landlords realise they can no longer hold original tenants liable.

Relevance: 10.00%

Abstract:

The treatment of effluents produced during the manufacture of metallurgical coke is normally carried out using the activated sludge process. The efficiency of activated sludges in purifying coke oven effluent depends largely on the maintenance of species of micro-organisms which destroy thiocyanate. The composition, production, toxicity and treatment of coke oven effluent at Corby steelworks are described. A review is presented which follows the progress made towards identifying and monitoring the species of bacteria which destroy thiocyanate in biological treatment plants purifying coke oven effluents. In the present study a search for bacteria capable of destroying thiocyanate led to the isolation of a species of bacteria, identified as Pseudomonas putida, which destroyed thiocyanate in the presence of succinate; this species had not previously been reported to use thiocyanate. Washed cell suspensions of P. putida destroyed phenol and thiocyanate simultaneously, and thiocyanate destruction was not suppressed by pyridine, aniline or catechol at the highest concentrations normally encountered in coke oven effluent. The isolate has been included, as N.C.I.B. 11198, in the National Collection of Industrial Bacteria, Torrey Research Station, Aberdeen. Three other isolates, identified as Achromobacter sp., Thiobacillus thioparus and T. denitrificans, were also confirmed to destroy thiocyanate. A technique has been developed for monitoring populations of different species of bacteria in activated sludges. Application of this technique to laboratory scale and full scale treatment plants at Corby showed that thiobacilli were usually not detected; thiobacilli were eliminated during the commissioning period of the full scale plant. However, experiments using a laboratory scale plant indicated that during a period of three weeks an increase in the numbers of thiobacilli might have contributed to an improvement in plant performance.
Factors which might have facilitated the development of thiobacilli are discussed. Large numbers of fluorescent pseudomonads capable of using thiocyanate were sometimes detected in the laboratory scale plant. The possibility is considered that catechol or other organic compounds in the feed-liquor might have stimulated fluorescent pseudomonads. Experiments using the laboratory scale plant confirmed that deteriorations in the efficiency of thiocyanate destruction were sometimes caused by bulking sludges, due to the excessive growth of fungal flocs. Increased dilution of the coke oven effluent was a successful remedy to this difficulty. The optimum operating conditions recommended by the manufacturer of the full scale activated sludge plant at Corby are assessed, and the role of bacterial monitoring in a programme of regular monitoring tests is discussed in relation to the operation of activated sludge plants treating coke oven effluents.

Relevance: 10.00%

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification undetected until that stage can be costly to maintain. The operational approach which emphasises the construction of executable specifications can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach of developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world and so the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented comprising an editor to facilitate the input of specifications, and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.

Relevance: 10.00%

Abstract:

Why some physicians recommend herbal medicines while others do not is not well understood. We undertook a survey designed to identify factors which predict recommendation of herbal medicines by physicians in Malaysia. About a third (206 out of 626) of the physicians working at the University of Malaya Medical Centre were interviewed face-to-face, using a structured questionnaire. Physicians were asked about their personal use of, recommendation of, perceived interest in, and usefulness and safety of herbal medicines. Using logistic regression modelling, we identified personal use, general interest, interest in receiving training, race and higher level of medical training as significant predictors of recommendation. St. John's wort is one of the most widely used herbal remedies. It is also probably the most widely evaluated herbal remedy, with no fewer than 57 randomised controlled trials. Evidence from the depression trials suggests that St. John's wort is more effective than placebo, while its comparative efficacy to conventional antidepressants is not well established. We updated previous meta-analyses of St. John's wort, described the characteristics of the included trials, applied methods of data imputation and transformation for incomplete trial data and examined sources of heterogeneity in the design and results of those trials. Thirty randomised controlled trials, which were heterogeneous in design, were identified. Our meta-analysis showed that St. John's wort was significantly more effective than placebo [pooled RR 1.90 (1.54-2.35)] and [pooled WMD 4.09 (2.33 to 5.84)]. However, the remedy was similar to conventional antidepressants in its efficacy [pooled RR 1.01 (0.93-1.10)] and [pooled WMD 0.18 (-0.66 to 1.02)]. Subgroup analyses of the placebo-controlled trials suggested that use of different diagnostic classifications at the inclusion stage led to different estimates of effect.
Similarly, a significant difference in the estimates of efficacy was observed when trials were categorised according to length of follow-up. Confounding between the variables diagnostic classification and length of trial was shown by loglinear analysis. Despite extensive study, there is still no consensus on how effective St. John's wort is in depression. However, most experts would agree that it has some effect. Our meta-analysis highlights the problems associated with the clinical evaluation of herbal medicines when the active ingredients are poorly defined or unknown. The problem is compounded when the target disease (e.g. depression) is also difficult to define and different instruments are available to diagnose and evaluate it.
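The pooling of risk ratios across trials can be sketched with a fixed-effect inverse-variance calculation. The trial counts below are hypothetical, not taken from the St. John's wort trials, and this is a generic illustration of the pooling step rather than the thesis's actual analysis: each trial's log risk ratio is weighted by the inverse of its approximate variance before averaging.

```python
import numpy as np

# Hypothetical trials: (events_treatment, n_treatment, events_placebo, n_placebo).
trials = [(30, 50, 18, 50), (45, 80, 25, 80), (20, 40, 12, 40)]

log_rr, weights = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)          # risk ratio for one trial
    var = 1/a - 1/n1 + 1/c - 1/n2     # approximate variance of log(RR)
    log_rr.append(np.log(rr))
    weights.append(1 / var)           # inverse-variance weight

# Fixed-effect pooled risk ratio: weighted mean on the log scale, then exponentiate.
pooled = np.exp(np.dot(weights, log_rr) / np.sum(weights))
print(round(pooled, 2))
```

With these made-up counts the pooled RR comes out near 1.73; a real meta-analysis would also report a confidence interval and heterogeneity statistics, which are omitted here for brevity.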

Relevance: 10.00%

Abstract:

The work described in the following pages was carried out at various sites in the Rod Division of the Delta Metal Company. Extensive variation in the level of activity in the industry during the years 1974 to 1975 had led to certain inadequacies being observed in the traditional cost control procedure. In an attempt to remedy this situation it was suggested that a method be found of constructing a system to improve the flexibility of cost control procedures. The work involved an assimilation of the industrial and financial environment via pilot studies, which later proved invaluable in homing in on the really interesting and important areas. Weaknesses in the current systems which came to light made the methodology of data collection and the improvement of cost control and profit planning procedures easier to adopt. Because the project was required to investigate the implications of cost behaviour for profit planning and control, the next stage of the research work was to utilise the on-site experience to examine, at a detailed level, the nature of cost behaviour. The analysis of factory costs then showed that certain costs, which were the most significant, exhibited a stable relationship with respect to some known variable, usually a specific measure of output. These costs were then formulated in a cost model, to establish accurate standards in a complex industrial setting and so provide a meaningful comparison against which to judge actual performance. The necessity of a cost model was reinforced by the fact that the cost behaviour found to exist was, in the main, a step function, a complexity which the traditional cost and profit planning procedures could not incorporate. Already implemented from this work is the establishment of the post of information officer, to co-ordinate data collection and information provision.
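Step-function cost behaviour of the kind described can be sketched in a few lines. The capacity and cost figures below are illustrative assumptions, not Delta Metal data: the idea is simply that cost jumps each time output requires an additional shift (or machine), rather than varying smoothly with output, which is why a linear standard-cost line misjudges actual performance.

```python
import math

def step_cost(output, capacity_per_shift=1000, cost_per_shift=5000.0):
    # Cost is incurred per whole shift: output of 1..1000 units needs one
    # shift, 1001..2000 needs two, and so on. Values are illustrative.
    shifts = math.ceil(output / capacity_per_shift) if output > 0 else 0
    return shifts * cost_per_shift

print(step_cost(900))   # one shift's cost
print(step_cost(1100))  # small output increase, but cost jumps a full step
```

A linear model fitted through these points would systematically over- or under-state cost near each step boundary, which is the behaviour the cost model in the abstract is built to capture.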

Relevance: 10.00%

Abstract:

The activities and function of the West Midlands Adverse Drug Reaction Study Group are described. The impact of the Group on the reporting of adverse drug reactions to the CSM by the yellow card system has been evaluated in several ways, including a comparison with the Trent Region. The role of the pharmacist in the Group is highlighted. A nationwide survey of the hospital pharmacist's involvement in adverse drug reaction reporting and monitoring is described, and the results are reported and discussed. The available sources of information on adverse drug reactions, both primary and secondary, are critically reviewed. A checklist of necessary details for case reports is developed and examples of problems in the literature are given. The contribution of the drug information pharmacist in answering enquiries and encouraging reporting is examined. A role for the ward pharmacist in identifying, reporting, documenting and following up adverse drug reactions is proposed. Studies conducted to support this role are described and the results discussed. The ward pharmacist's role in preventing adverse drug reactions is also outlined. The reporting of adverse drug reactions in Australia is contrasted with the U.K., and particular attention is drawn to the pharmacist's contribution in the former. The problems in evaluating drug safety are discussed and examples are given where serious reactions have only been recognised after many patients have been exposed. To remedy this situation, a case is made for enhancing the CSM yellow card scheme by further devolution of reporting, increasing the involvement of pharmacists and improving arrangements at the CSM. It is proposed that pharmacists should undertake the responsibility for reporting reactions to the CSM in some instances.

Relevance: 10.00%

Abstract:

This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study of both the methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - by directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear. This is despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis will be to prove that much more effective and powerful analysis of MEG can be achieved if one were to assume the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in the MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high costs and low portability of state-of-the-art multichannel machines. The result of this is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines. In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.

Relevance: 10.00%

Abstract:

This industrially based research project was undertaken for British Leyland and arose as a result of poor system efficiency on the Maxi and Marina vehicle body build lines. The major factors in the deterioration of system efficiency were identified as: a) the introduction of a 'Gateline' system of vehicle body build; b) the degeneration of a newly introduced measured daywork payment scheme. By relating the conclusions of past work on payment systems to the situation at Cowley, it was concluded that a combination of poor industrial relations and a lack of managerial control had caused the measured daywork scheme to degenerate into a straightforward payment for time at work. This eliminated the monetary incentive to achieve schedule, with the consequence that both inefficiency and operating costs increased. To analyse further the cause of inefficiency, a study of Marina gateline stoppage logs was carried out. This revealed that poor system efficiency on the gateline was caused more by the nature of its design than by poor reliability of individual items of plant. The consideration given to system efficiency at the design stage was found to be negligible, the main obstacles being: a) a lack of understanding of the influence of certain design factors on the efficiency of a production line; b) the absence of data and techniques to predict system efficiency at the design stage. To remedy this situation, a computer simulation study of the design factors was carried out, from which relationships with system efficiency were established and empirical efficiency equations developed. Sets of tables were compiled from the equations, and efficiency data relevant to vehicle body building were established from the gateline stoppage logs. Computer simulation, the equations and the tables, when used in conjunction with good efficiency data, are shown to be accurate methods of predicting production line system efficiency.
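The kind of simulation described can be sketched in miniature. This is not the thesis's actual model; the function name, station count and stoppage probability are illustrative assumptions. It shows the basic design effect at issue: on a serial line where any one station stopping halts the whole line, efficiency falls multiplicatively as stations are added, even when each station is individually reliable.

```python
import random

def line_efficiency(n_stations, p_stop, n_cycles=100_000, seed=1):
    # Monte Carlo sketch of a serial line: a cycle produces output only if
    # every station runs, so the whole line is halted by any single stoppage.
    rng = random.Random(seed)
    running = sum(
        all(rng.random() > p_stop for _ in range(n_stations))
        for _ in range(n_cycles)
    )
    return running / n_cycles

# With 10 stations each stopped in 2% of cycles, the analytical efficiency
# is 0.98**10, roughly 0.82 - the simulation estimate should land nearby.
print(round(line_efficiency(10, 0.02), 3))
```

A real study would add buffers between stations and empirically measured stoppage distributions, which is where design factors start to matter and closed-form answers give way to simulation.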

Relevance: 10.00%

Abstract:

This thesis describes an investigation into a Local Authority's desire to use its airport to aid regional economic growth. Short studies on air freight, the impact of an airport on the local economy, incoming tourism, and the factors influencing airlines in their use of airports show that this desire is valid, but only in so far as the airport enables air services to be provided. A survey of airlines, conducted to remedy some deficiencies in the documented knowledge on airline decision-making criteria, indicates that there is cause for concern about the methods used to develop air services. A comparison with the West German network suggests that Birmingham is underprovided with international scheduled flights, and reinforces the survey conclusion that an airport authority must become actively involved in the development of air services. Participation in the licence applications of two airlines to use Birmingham Airport confirms the need for involvement, but without showing the extent of the influence which an airport authority may exert. The conclusion is reached that, in order to fulfil its development potential, an airport must be marketed to both the general public and the air transport industry. There is also a need for a national air services plan.

Relevance: 10.00%

Abstract:

This paper shows that many structural remedies in a sample of European merger cases result in market structures which would probably not be cleared by the Competition Authority (CA) if they were the result of merger (rather than remedy). This is explained by the fact that the CA's objective through remedy is to restore pre-merger competition, but markets are often highly concentrated even before merger. If so, the CA must often choose between clearing an 'uncompetitive' merger, or applying an unsatisfactory remedy. Here, the CA appears reluctant to intervene against coordinated effects, if doing so enhances a leader's dominance.