428 results for Remedy
Abstract:
This paper investigates a puzzling feature of social conventions: the fact that they are both arbitrary and normative. We examine how this tension is addressed in sociological accounts of conventional phenomena. Traditional approaches tend to generate either synchronic accounts that fail to consider the arbitrariness of conventions, or diachronic accounts that miss central aspects of their normativity. As a remedy, we propose a processual conception that considers conventions as both the outcome and material cause of much human activity. This conceptualization, which borrows from the économie des conventions as well as critical realism, provides a novel perspective on how conventions are nested and defined, and on how they are established, maintained and challenged.
Abstract:
The recent report of the Milburn Review into Social Mobility highlights the under-representation of young people from lower socio-economic groups in higher education and encourages universities and others to act to remedy this situation as a contribution to greater social mobility. The paper uses data from the Longitudinal Study of Young People in England to examine the relationship between social background, attainment and university participation. The results show that differences in school-level attainment associated with social background are by far the most important explanation for social background differences in university attendance. However, there remains a small proportion of the participation gap that is not accounted for by attainment. It is also the case that early intentions for higher education participation are highly predictive of actual participation. The results suggest that although there may be some scope for universities to act to improve participation by people from less advantaged backgrounds, a much more important focus of action is on improving the school-level achievement of these students.
Abstract:
Recent years have seen an increase in sociolinguistic studies of the Latin language devoted to aspects and forms of politeness as part of general linguistic behaviour, and considerable progress has been made overall. One area, albeit central to the notion of politeness, has been conspicuously neglected so far, and this area is best summarised by the term ‘apologies’. This paper aims to remedy this situation and to provide a first extensive case study, based on the textual corpus of Terence.
Abstract:
In a series of recent cases, courts have reasserted unconscionability as the basis of proprietary estoppel and in doing so have moved away from the structured form of discretion envisaged in the classic Taylors Fashions formula. In light of these developments, this paper traces the use of unconscionability in estoppel and examines the changing role attributed to the concept. In a parallel development, in exercising their remedial discretion once a claim to estoppel has been established, the courts have emphasised the foundation of estoppel in unconscionability to assert the need for proportionality between the detriment and remedy as ‘the most essential requirement’. Collectively, the cases demonstrate a lack of transparency or consistency, which raises concerns that the courts are descending into a form of individualised discretion. These developments are of particular concern as they come at a time when commentators are predicting a ‘boom’ in estoppel to follow the introduction of electronic conveyancing.
Abstract:
This paper considers the utility of the concept of conscience or unconscionable conduct as a contemporary rationale for intervention in two principles applied where a person seeks to renege on an informal agreement relating to land: the principle in Rochefoucauld v Boustead; and transfers 'subject to' rights in favour of a claimant. By analysing the concept in light of our current understanding of the nature of judicial discretion and the use of general principles, it responds to arguments that unconscionability is too general a concept on which to base intervention. In doing so, it considers the nature of the discretion that is actually in issue when the court intervenes through conscience in these principles. However, the paper questions the use of constructive trusts as a response to unconscionability. It argues that there is a need, in limited circumstances, to separate the finding of unconscionability from the imposition of a constructive trust. In these limited circumstances, once unconscionability is found, the courts should have a discretion as to the remedy, modelled on that developed in the context of proprietary estoppel. The message underlying this paper is that many of the concerns expressed about unconscionability that have led to suggestions of alternative rationales for intervention can in fact be addressed whilst retaining an unconscionability analysis. Unconscionability remains a preferable rationale for intervention as it provides a common thread that links apparently separate principles and can assist our understanding of their scope.
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
Abstract:
Cistus is a plant genus traditionally used in folk medicine as a remedy for several microbial disorders and infections. The abundance of Cistus spp. in the Iberian Peninsula, together with their ability to renew after wildfire, contributes to their profitability as suppliers of functional ingredients. The aim of this study was to provide a comprehensive characterization of the volatile profile of different Cistus plants grown in Spain: Cistus ladanifer L., Cistus albidus L., Cistus salviifolius L., and Cistus clusii Dunal (the latter has not been studied before). A system combining headspace solid-phase microextraction and gas chromatography coupled to mass spectrometry (HS-SPME-GC–MS) was implemented; thereby, the volatile compounds were extracted and analyzed in a fast, reliable and environment-friendly way. A total of 111 volatile compounds were identified, 28 of which were reported in Cistus for the first time. The most abundant components of the samples (mono- and sesquiterpenes) have been previously reported as potent antimicrobial agents. Therefore, this work reveals the potential use of the Cistus spp. studied as natural sources of antimicrobial compounds for industrial production of cosmeceuticals, among other applications.
Abstract:
Model-based estimates of future uncertainty are generally based on the in-sample fit of the model, as when Box-Jenkins prediction intervals are calculated. However, this approach will generate biased uncertainty estimates in real time when there are data revisions. A simple remedy is suggested, and used to generate more accurate prediction intervals for 25 macroeconomic variables, in line with the theory. A simulation study based on an empirically-estimated model of data revisions for US output growth is used to investigate small-sample properties.
Abstract:
From the start of the English civil war, the parliamentarians were a fragmented coalition, held together by distrust of the king and a belief that Parliament was entitled to lead action to remedy his government’s deficiencies. The driving motivations of parliamentarians were various, including the religious commitments of puritanism, legalistic thought about the ancient constitution, and more radical notions of republicanism or natural rights. Historians have disputed whether parliamentarianism had an inherent strand of radicalism – or radical potential – from the early 1640s, but radicalization certainly took place as the civil wars went on, alongside more ‘conservative’ reactions against the propaganda and wartime measures employed by parliament. Parliamentarian radicalism itself was varied in character, embracing the Levellers’ populism, parliamentary absolutism, and millenarian and providentialist ideas.
Abstract:
The question is addressed whether using unbalanced updates in ocean-data assimilation schemes for seasonal forecasting systems can result in a relatively poor simulation of zonal currents. An assimilation scheme, where temperature observations are used for updating only the density field, is compared to a scheme where updates of the density field and zonal velocities are related by geostrophic balance. This is done for an equatorial linear shallow-water model. It is found that equatorial zonal velocities can deteriorate if velocity is not updated in the assimilation procedure. Adding balanced updates to the zonal velocity is shown to be a simple remedy for the shallow-water model. Next, optimal interpolation (OI) schemes with balanced updates of the zonal velocity are implemented in two ocean general circulation models. First tests indicate a beneficial impact on equatorial upper-ocean zonal currents.
Abstract:
A sizeable population of weak line galaxies (WLGs) is often left out of statistical studies on emission-line galaxies (ELGs) due to the absence of an adequate classification scheme, since classical diagnostic diagrams, such as [O iii]/H beta versus [N ii]/H alpha (the BPT diagram), require the measurement of at least four emission lines. This paper aims to remedy this situation by transposing the usual divisory lines between star-forming (SF) galaxies and active galactic nuclei (AGN) hosts, and between Seyferts and LINERs, to diagrams that are more economical in terms of line quality requirements. By doing this, we rescue from the classification limbo a substantial number of sources and modify the global census of ELGs. More specifically, (1) we use the Sloan Digital Sky Survey Data Release 7 to constitute a suitable sample of 280 000 ELGs, one-third of which are WLGs. (2) Galaxies with strong emission lines are classified using the widely applied criteria of Kewley et al., Kauffmann et al. and Stasinska et al. to distinguish SF galaxies and AGN hosts, and of Kewley et al. to distinguish Seyferts from LINERs. (3) We transpose these classification schemes to alternative diagrams, keeping [N ii]/H alpha as a horizontal axis but replacing H beta by a stronger line (H alpha or [O ii]), or substituting the ionization-level-sensitive [O iii]/H beta ratio with the equivalent width of H alpha (W(H alpha)). Optimized equations for the transposed divisory lines are provided. (4) We show that nothing significant is lost in the translation, but that the new diagrams allow one to classify up to 50 per cent more ELGs. (5) Introducing WLGs in the census of galaxies in the local Universe increases the proportion of metal-rich SF galaxies and especially LINERs. In the course of this analysis, we were led to make the following points. (i) The Kewley et al. BPT line for galaxy classification is generally ill-used.
(ii) Replacing [O iii]/H beta by W(H alpha) in the classification introduces a change in the philosophy of the distinction between LINERs and Seyferts, but not in its results. Because the W(H alpha) versus [N ii]/H alpha diagram can be applied to the largest sample of ELGs without loss of discriminating power between Seyferts and LINERs, we recommend its use in further studies. (iii) The dichotomy between Seyferts and LINERs is washed out by WLGs in the BPT plane, but it subsists in other diagnostic diagrams. This suggests that the right wing in the BPT diagram is indeed populated by at least two classes, tentatively identified with bona fide AGN and 'retired' galaxies that have stopped forming stars and are ionized by their old stellar populations.
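The classical strong-line criteria that this paper starts from can be sketched in a few lines. The divisory-line equations below are the standard Kauffmann et al. (2003) and Kewley et al. (2001) curves on the BPT plane; this simplified classifier illustrates only the classical scheme and none of the transposed diagrams or line-quality cuts the paper introduces.

```python
def bpt_class(log_n2ha, log_o3hb):
    """Classify an emission-line galaxy on the BPT plane using two
    standard divisory lines:
      Kauffmann et al. (2003): y = 0.61 / (x - 0.05) + 1.30
      Kewley et al. (2001):    y = 0.61 / (x - 0.47) + 1.19
    with x = log10([N II]/Halpha) and y = log10([O III]/Hbeta).
    Returns 'SF' below the Kauffmann line, 'composite' between the
    two lines, and 'AGN' above the Kewley line."""
    x, y = log_n2ha, log_o3hb
    if x < 0.05 and y < 0.61 / (x - 0.05) + 1.30:
        return "SF"
    if x < 0.47 and y < 0.61 / (x - 0.47) + 1.19:
        return "composite"
    return "AGN"

print(bpt_class(-0.7, -0.3))  # typical star-forming ratios -> SF
print(bpt_class(0.1, 1.0))    # above both lines -> AGN
```

WLGs are precisely the sources for which one of the four required line measurements is too weak to place the galaxy reliably on this plane, which is what motivates the transposed diagrams.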
Abstract:
The traditional reduction methods to represent the fusion cross sections of different systems are flawed when attempting to completely eliminate the geometrical aspects, such as the heights and radii of the barriers, and the static effects associated with the excess neutrons or protons in weakly bound nuclei. We remedy this by introducing a new dimensionless universal function, which allows the separation and disentanglement of the static and dynamic aspects of the breakup coupling effects connected with the excess nucleons. Applying this new reduction procedure to fusion data of several weakly bound systems, we find a systematic suppression of complete fusion above the Coulomb barrier and enhancement below it. Different behaviors are found for the total fusion cross sections. They are appreciably suppressed in collisions of neutron-halo nuclei, while they are practically not affected by the breakup coupling in cases of stable weakly bound nuclei. (C) 2009 Elsevier B.V. All rights reserved.
Predictive models for chronic renal disease using decision trees, naïve Bayes and case-based methods
Abstract:
Data mining can be used in the healthcare industry to "mine" clinical data and discover hidden information for intelligent and effective decision making. The discovery of hidden patterns and relationships often goes unexploited, yet advanced data mining techniques can be a helpful remedy for this scenario. This thesis mainly deals with Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms used to predict chronic renal disease. Data from the database are initially imported into Weka (3.6), and the Chi-Square method is used for feature selection. After normalizing the data, three classifiers were applied and the efficiency of the output was evaluated: Decision Tree, Naïve Bayes, and the K-Nearest Neighbour algorithm. Results show that each technique has its unique strength in realizing the objectives of the defined mining goals. The efficiency of the Decision Tree and KNN was almost the same, but Naïve Bayes showed a comparative edge over the others. Further, sensitivity and specificity are used as statistical measures to examine the performance of the binary classification. Sensitivity (also called recall in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models; it consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
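The sensitivity and specificity measures described above reduce to two ratios over the entries of a binary confusion matrix. A minimal sketch follows; the confusion-matrix counts are hypothetical and are not results from the thesis.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity (recall) = TP / (TP + FN): proportion of actual
    positives correctly identified.  Specificity = TN / (TN + FP):
    proportion of actual negatives correctly identified."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical confusion matrix for a renal-disease classifier:
# 40 true positives, 10 false negatives, 45 true negatives, 5 false positives.
sens, spec = sensitivity_specificity(tp=40, fn=10, tn=45, fp=5)
print(sens, spec)  # 0.8 0.9
```

For a screening task like disease prediction, sensitivity is usually the more critical of the two, since a false negative leaves a sick patient undetected.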
Abstract:
The problem of semantics is inherent in any discussion of ethics. The general term "ethics" is itself commonly confused. In addition, systems of ethics must be built upon assumptions, and assumptions are necessarily subject to lengthy debate. These two problems are encountered in my investigation of the ethical practices of the modern business community, and to remedy the situation I have taken two steps: the first being an attempt to clarify the meaning of terms used therein; and the second being a clear description of the assumptions utilized to further my analysis. To satisfy those who would disagree with these assumptions, I have attempted to outline the consequences of differing premises. The first assumption in my discussion is that the capitalistic economy is powered by the motivation supplied by man's self-interest. We are conditioned to basing our courses of action upon an orientation toward gratifying this self-interest. Careers are chosen by blending aptitude, interest, and remuneration. Of course, some people are less materially inclined than others, but the average member of our capitalistic society is concerned with the physical rewards derived from his employment. Status and happiness are all-important considerations in pursuing a chosen course of action, yet all too often they are measured in physical terms. The normal self-interest natural to mankind is heightened in capitalism, due to the emphasis placed upon material compensation. Our thinking becomes mechanistic as life devolves into a complex game played by the rules. We are accustomed to performing meaningless or unpleasant duties to fulfill our gratifications. Thought, consequently, interferes with the completion of our everyday routines. We learn quickly not to be outspoken, as the outspoken one threatens the security of his fellow man.
The majority of people are quite willing to accept others' views on morality, and indeed this is the sensible thing to do, as one does not risk his own neck. The unfortunate consequence of this situation has been the substitution of the legal and jural for the moral and ethical. Our actions are guided by legal considerations, and nowhere has this been more evident than in the business community. The large legal departments of modern corporations devote full time to inspecting the legality of corporate actions. The business community has become preoccupied with the law, yet this is necessarily so. Complex, modern, capitalistic society demands an elaborate framework of rules and regulations. Without this framework it would be impossible to have an orderly economy, to say nothing of protecting the best interests of the people. However, the inherent complexities, contradictions, and sometimes unfair aspects of our legal system can tempt men to take things into their own hands. From time to time cases arise where men have broken laws while acting in good faith, and other cases where men have been extremely unethical without being illegal. Examples such as these foster the growth of cynicism, and generally create an antagonistic attitude toward the law on the part of business. My second assumption is that the public, on the whole, has adopted an apathetic attitude toward business morality. When faced with an ethical problem, far too many people choose to cynically assume that "if I don't do it, someone else will." The danger of such an assumption lies in that it eliminates many of the inhibitions that normally would preclude unethical action. The preventative factor in contemplating an unethical act lies not only in its going against the "right course of action", but also in that it would display the actor as one of the few immoral practitioners.
However, if the contemplator feels that many other people follow the same course of action, he would not feel himself to be so conspicuous. These two assumptions underlie my entire discussion of modern business ethics, and in my judgment are the two most important causal factors in unethical acts perpetrated by the business community. The future elimination of these factors seems improbable, yet there is no reason to consider things worse than they ever have been before. The heightened public interest in business morality undoubtedly lies in part in the fact that examples of corporate malpractice are of such magnitude and scope, and hence more newsworthy.
Abstract:
The Internet has taken the world by storm. It has eliminated the barriers of technology, and unlocked the doors to electronic commerce and the 'Virtual Economy'. It has given us a glimpse into the future of 'Business' itself, and it has created a bewildering variety of choices in our personal and professional lives. It has taken on a life of its own, and we are all frantically trying to keep up. Many overwhelmed companies are asking questions like: 'What should our Internet Strategy be?', 'How do we put our business on the Internet like everybody else is doing?' or 'How do we use this thing to make money without spending any?'. These questions may seem reasonable on the surface, but they miss the point because they focus on the technologies rather than the core issues of conducting day-to-day business. The Internet can indeed offer fast returns in marketing reach, speed, direct consumer sales and so on, and many companies are using it to good advantage, but the highest and best use of any such technology is to support, enhance and even re-invent the fundamentals of general business practice. When the initial excitement is over, and companies gain experience and confidence with the new business models, this larger view will begin to assert itself. Companies will then start to position their 'Internet Strategies' in the context of where the business world itself is going over time, and how they can prepare for what is to come. Until now, the business world has been very fragmented, its collective progress limited (in part) by the inability to communicate within and between companies. Now that a technical remedy seems to be at hand and standards are beginning to emerge, we are starting to see a trend toward consolidation, cooperation, and economic synergy. Companies are improving their internal business processes with Intranets, and Electronic Commerce initiatives have sprung up using EDI, the World Wide Web, E-Mail, secure credit card payments and other tools.
Companies are using the Internet to talk to each other and to sell their goods and services to the end consumer. Like Berlin, the walls are coming down because they have to. Electronic 'Communities of Common Interest' are beginning to surface, with the goal of supporting and aligning similar industries (such as Government, Insurance, Transportation and Health care) or similar business functions (such as Purchasing, Payments, and Human Resources). As these communities grow and mature, their initial scope will broaden and their spheres of influence will expand. They will begin to overlap into other communities, creating a synergistic effect and reshaping the conduct of business. The business world will undergo a gradual evolution toward globalization, driven by economic imperatives and natural selection in the marketplace, and facilitated by Electronic Commerce and Internet technologies. The business world 'beyond 2000' will have a substantially different look and feel than that which we see today.