891 results for Problem of evil
Abstract:
The intensity of the long-range correlations observed with the classical HMBC pulse sequence is directly related to the size of the coupling constant, so the statically optimized long-range coupling delay is often set as a compromise. As such, some long-range correlations might appear with a reduced intensity or might even be completely absent from the spectra. After a short introduction, this third manuscript will give a detailed review of some selected HMBC variants dedicated to improving the detection of long-range correlations, such as the ACCORD-HMBC, CIGAR-HMBC, and broadband HMBC experiments. Practical details about the accordion optimization, which affords a substantial improvement in both the number and intensity of the long-range correlations observed but introduces a modulation in F1, will be discussed. The incorporation of the so-called constant time variable delay in the CIGAR-HMBC experiment, which can trigger or even completely suppress the 1H–1H coupling modulation inherent to the use of the accordion principle, will also be discussed. The broadband HMBC scheme, which consists of recording a series of HMBC spectra with different delays set as a function of the long-range heteronuclear coupling constant ranges and the transverse relaxation times T2, is also examined.
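As a back-of-the-envelope illustration of the delay choices being discussed (a sketch of our own, not taken from the manuscript), the statically optimized long-range delay is commonly set to 1/(2·J) for an assumed compromise coupling constant, while an accordion-optimized experiment sweeps the delay between the values corresponding to the largest and smallest couplings of interest:

```python
# Illustrative only: typical delay choices for HMBC-type experiments.
# The 8 Hz compromise value and the 2-25 Hz accordion range are assumptions
# made for the sake of the example, not values taken from the manuscript.

def hmbc_delay(j_hz: float) -> float:
    """Long-range evolution delay Delta = 1 / (2 * J), in seconds."""
    return 1.0 / (2.0 * j_hz)

compromise_j = 8.0            # Hz, a common static optimization target
j_min, j_max = 2.0, 25.0      # Hz, assumed range of long-range couplings

print(f"static delay:          {hmbc_delay(compromise_j) * 1e3:.1f} ms")
print(f"accordion delay sweep: {hmbc_delay(j_max) * 1e3:.1f} ms "
      f"to {hmbc_delay(j_min) * 1e3:.1f} ms")
```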
Abstract:
Hepatocellular carcinoma (HCC) is a cancer with globally rising incidence. Growing evidence supports associations between metabolic syndrome, diabetes, and obesity and HCC arising in patients with nonalcoholic fatty liver disease (NAFLD). This constitutes a problem of alarming magnitude given the rising epidemic of these conditions. The role of diabetes seems to be particularly important when associated with obesity or cirrhosis. Excess hepatic iron may be another potential risk factor for the development of NAFLD-associated HCC. In the context of NAFLD, HCC frequently develops in a liver that is not yet cirrhotic. As there are no surveillance programs for these patients, diagnosis often occurs at a tumor stage beyond curative options. Clinical, tumor, and patient characteristics in NAFLD-associated HCC differ from those of other etiologies. Older age and cardiovascular comorbidities may further limit treatment options. The outcome in patients with NAFLD-associated early HCC is excellent, and aggressive treatment should therefore be pursued in appropriate patients. Population-based prevention to reduce the culprit (NAFLD), early recognition through targeted surveillance programs in risk-stratified patients, and effective treatment of HCC associated with NAFLD are urgently needed. In this review, the authors summarize the epidemiology, risk factors, features, and prevention of NAFLD-associated HCC.
Abstract:
I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy (previously introduced, but not worked out in detail, by Anjan Chakravartty) is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy to the case of Jean Perrin’s experimental work on the atomic hypothesis, disputing Stanford’s claim that the problem of unconceived alternatives invalidates a realist interpretation of this historical episode.
Abstract:
The objective of this paper is to design a path following control system for a car-like mobile robot using classical linear control techniques, so that it adapts on-line to varying conditions during the trajectory following task. The main advantage of the proposed control structure is that well-known linear control theory can be applied to calculate the PID controllers so as to fulfil the control requirements, while at the same time the structure is flexible enough to be applied under the non-linear, changing conditions of the path following task. For this purpose the Frenet frame kinematic model of the robot is linearised at a varying working point that is calculated as a function of the actual velocity, the path curvature, and the kinematic parameters of the robot, yielding a transfer function that varies along the trajectory. The proposed controller is formed by a combination of an adaptive PID and a feed-forward controller, which varies according to the working conditions and compensates the non-linearity of the system. The good features and flexibility of the proposed control structure have been demonstrated through realistic simulations that include both the kinematics and dynamics of the car-like robot.
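To make the control structure concrete, here is a minimal sketch of an adaptive PID combined with a feed-forward term in the spirit described above; the gain-scheduling law and the bicycle-model feed-forward used below are our own placeholder assumptions, not the controllers derived in the paper from the linearised Frenet-frame model:

```python
# Minimal illustrative sketch of an adaptive PID plus feed-forward structure.
# The gain-scheduling law and feed-forward term below are placeholders chosen
# for illustration; the paper derives its controllers from the linearised
# Frenet-frame kinematic model.

import math
from dataclasses import dataclass


@dataclass
class AdaptivePIDWithFeedforward:
    kp: float = 0.0
    ki: float = 0.0
    kd: float = 0.0
    _integral: float = 0.0
    _prev_error: float = 0.0

    def retune(self, velocity: float, curvature: float, wheelbase: float) -> None:
        """Recompute PID gains at the current working point (hypothetical law)."""
        v = max(velocity, 0.1)                    # avoid blow-up at standstill
        self.kp = 2.0 / v
        self.ki = 0.2 / v
        self.kd = 0.5 * wheelbase * (1.0 + abs(curvature))

    def feedforward(self, curvature: float, wheelbase: float) -> float:
        """Steering angle that tracks the path curvature in a bicycle model."""
        return math.atan(wheelbase * curvature)

    def control(self, lateral_error: float, dt: float,
                velocity: float, curvature: float, wheelbase: float) -> float:
        """Adaptive PID feedback on the lateral error plus curvature feed-forward."""
        self.retune(velocity, curvature, wheelbase)
        self._integral += lateral_error * dt
        derivative = (lateral_error - self._prev_error) / dt
        self._prev_error = lateral_error
        feedback = (self.kp * lateral_error
                    + self.ki * self._integral
                    + self.kd * derivative)
        return feedback + self.feedforward(curvature, wheelbase)
```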
Abstract:
In university studies, it is not unusual for students to drop some of the subjects they have enrolled in for the academic year. They start by not attending lectures, sometimes through neglect or carelessness, or because they find the subject too difficult, and as a result they lose the continuity of the topics the professor follows. If they try to attend again, they discover that they hardly understand anything, become discouraged, and decide to give up attending lectures and study on their own. However, some fail to turn up for their final exams, and the failure rate among those who do sit the exams is high. The problem is not limited to one specific subject; it is often the same with many subjects. The result is that students are not productive enough, wasting time and prolonging their years of study, which entails a great cost for families. Degree courses structured to be completed in three academic years may in fact take an average of six or more. In this paper we have studied this problem, which, apart from the waste of money and time, produces frustration in the student, who finds that he has not been able to achieve what he had set out to do at the beginning of the course. It is quite common to find students who do not pass even 50% of the subjects they enrolled in for the academic year. If this happens repeatedly, it can be the point at which a student considers dropping out altogether. This is also a concern for the universities, especially in the early years. In our experience as professors, we have found that students who attend lectures regularly and follow the explanations approach the final exams with confidence and rarely fail the subject. In this proposal we present some techniques and methods intended to solve, as far as possible, the problem of poor attendance at lectures. This involves "rewarding students for their attendance and participation in lectures": rewarding attendance with a "prize" that counts towards the final mark for the subject and involving students more in the development of lectures. We believe that we have to teach students to use lectures as part of their learning in a non-passive way. We consider the professor's work fundamental in conveying the usefulness of the topics explained and the applications they will have for students' future professional lives. In this way the student sees for himself the use and importance of what he is learning, and when his participation is required he will feel more involved and confident within the educational system. Finally, we present statistical results of studies carried out on different degrees and different subjects over two consecutive years. In the first year we assessed only the final exams, without considering students' attendance or participation. In the second year we applied the techniques and methods proposed here. In addition, we have compared the two ways of assessing the subjects.
Abstract:
The problem of approximately parameterizing algebraic curves and surfaces is an active research field, with many implications for practical applications. The problem can be treated locally or globally. We formally state the problem in its global version for the case of algebraic curves (planar or spatial), and we report on some algorithms approaching it, as well as on the associated error distance analysis.
Abstract:
This paper is a critical examination of Alfred North Whitehead's attempt to solve the traditional problem of evil. Whitehead's conception of evil is crucial to his process cosmology because it is integral to his notion of creation, in which evil is understood in relationship to the larger dynamic of God's creative activity. While Whitehead's process theodicy is interesting, he fails to escape between the horns of the traditional dilemma. Whitehead is often criticized for treating evil as merely apparent. While some process philosophers, notably Maurice Barineau, have defended Whitehead against this charge, it can be shown that this is an implication of Whitehead's approach. Moreover, Whitehead's theodicy fails to address radical moral evil in its concrete dimension with respect to real human suffering. As a result, Whitehead's theodicy is not relevant to Christian theology. My paper is divided into two parts. I will first briefly discuss the traditional problem of evil and some of the traditional solutions proposed to resolve it. The remainder of the paper will demonstrate why Whitehead's theodicy addresses the traditional problem of evil only at the expense of theological irrelevancy.
Abstract:
In this paper I want to develop a particular kind of greater-good response to the problems of evil and hell, one which hence can serve as a backup plan should the free will defense not satisfy. Ultimately, this response will appear to belong to several traditions in theodicy. Like all greater-goods views, this one relies on explaining the existence of evil in terms of the greater goods that come out of it. Among these goods are the greater goods of Incarnation and Atonement, their respective goodness consisting in large part in the higher-order divine good of glorifying God through the display of divine virtue.
Abstract:
A new mathematical model is proposed for the spreading of a liquid film on a solid surface. The model is based on the standard lubrication approximation for gently sloping films (with the no-slip condition for the fluid at the solid surface) in the major part of the film where it is not too thin. In the remaining and relatively small regions near the contact lines it is assumed that the so-called autonomy principle holds—i.e., given the material components, the external conditions, and the velocity of the contact lines along the surface, the behavior of the fluid is identical for all films. The resulting mathematical model is formulated as a free boundary problem for the classical fourth-order equation for the film thickness. A class of self-similar solutions to this free boundary problem is considered.
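For orientation, the classical fourth-order equation referred to here is the thin-film (lubrication) equation for the film thickness h(x, t); in its standard one-dimensional, surface-tension-driven, nondimensionalized form (our notation, not necessarily the exact scaling used by the authors) it reads

\[
  \frac{\partial h}{\partial t} + \frac{\partial}{\partial x}\!\left( h^{3}\,\frac{\partial^{3} h}{\partial x^{3}} \right) = 0,
\]

with the small regions near the contact lines, handled through the autonomy principle, supplying the conditions at the free boundary.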
Abstract:
For each pair (n, k) with 1 ≤ k < n, we construct a tight frame (ρ_λ : λ ∈ Λ) for L^2(R^n), which we call a frame of k-plane ridgelets. The intent is to efficiently represent functions that are smooth away from singularities along k-planes in R^n. We also develop tools to help decide whether k-plane ridgelets provide the desired efficient representation. We first construct a wavelet-like tight frame on the X-ray bundle χ_{n,k}—the fiber bundle having the Grassmann manifold G_{n,k} of k-planes in R^n for base space, and for fibers the orthocomplements of those planes. This wavelet-like tight frame is the pushout to χ_{n,k}, via the smooth local coordinates of G_{n,k}, of an orthonormal basis of tensor Meyer wavelets on Euclidean space R^{k(n−k)} × R^{n−k}. We then use the X-ray isometry [Solmon, D. C. (1976) J. Math. Anal. Appl. 56, 61–83] to map this tight frame isometrically to a tight frame for L^2(R^n)—the k-plane ridgelets. This construction makes the analysis of a function f ∈ L^2(R^n) by k-plane ridgelets identical to the analysis of the k-plane X-ray transform of f by an appropriate wavelet-like system for χ_{n,k}. As wavelets are typically effective at representing point singularities, it may be expected that these new systems will be effective at representing objects whose k-plane X-ray transform has a point singularity. Objects with discontinuities across hyperplanes are of this form, for k = n − 1.
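For reference, the k-plane X-ray transform underlying this construction has the standard definition (a textbook formula stated in our own notation, not quoted from the paper): for a k-plane π ∈ G_{n,k} and a point x ∈ π^⊥,

\[
  (X_{k} f)(\pi, x) \;=\; \int_{\pi} f(x + y)\,\mathrm{d}y,
\]

so that f is integrated over the affine k-plane x + π and the transform lives naturally on the X-ray bundle χ_{n,k}.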
Abstract:
Competing hypotheses seek to explain the evolution of the oxygenic and anoxygenic processes of photosynthesis. Since chlorophyll is less reduced and precedes bacteriochlorophyll on the modern biosynthetic pathway, it has been proposed that chlorophyll preceded bacteriochlorophyll in evolution. However, recent analyses of nucleotide sequences that encode chlorophyll and bacteriochlorophyll biosynthetic enzymes appear to provide support for the alternative hypothesis that bacteriochlorophyll evolved earlier than chlorophyll. Here we demonstrate that the presence of invariant sites in sequence datasets leads to inconsistency in tree building (including maximum-likelihood methods). Homologous sequences with different biological functions often share invariant sites at the same nucleotide positions. However, different constraints can also result in additional invariant sites unique to genes with specific and different biological functions. Consequently, the distribution of these sites can be uneven between the different types of homologous genes. The presence of invariant sites, both those shared by related biosynthetic genes and those unique to only some of these genes, has misled the recent evolutionary analysis of oxygenic and anoxygenic photosynthetic pigments. We evaluate an alternative scheme for the evolution of chlorophyll and bacteriochlorophyll.
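To make the notion of invariant sites concrete, here is a toy sketch of our own (not the authors' analysis pipeline): a site is invariant when every sequence in the alignment carries the same nucleotide at that position, and such columns can be tallied, and if desired removed or modeled explicitly, before tree building.

```python
# Toy illustration (not the authors' analysis): find the invariant sites in a
# nucleotide alignment, i.e. columns where every sequence has the same base.

def invariant_sites(alignment: list[str]) -> list[int]:
    """Return the 0-based positions at which all aligned sequences agree."""
    length = len(alignment[0])
    assert all(len(seq) == length for seq in alignment), "sequences must be aligned"
    return [i for i in range(length)
            if len({seq[i] for seq in alignment}) == 1]

# Hypothetical mini-alignment of biosynthetic-gene fragments.
aln = [
    "ATGGCATTC",
    "ATGACATTC",
    "ATGGCTTTC",
]
sites = invariant_sites(aln)
print(f"{len(sites)} of {len(aln[0])} sites are invariant: {sites}")
```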
Abstract:
The phenomenon of desensitization is universal, but its mechanism is still ill-understood and controversial. A recently published study [Lin, F. & Stevens, C. F. (1994) J. Neurosci. 14, 2153–2160] attempted to cast light on the mechanism of desensitization of N-methyl-D-aspartate (NMDA) receptors, in particular the vexed question of whether the channel must open before it can desensitize. During the desensitizing preexposure to agonist in those experiments, more desensitization was produced when channel openings were observed than when no openings were observed. The conclusion that "desensitization occurs more rapidly from the open state" unfortunately was based on a stochastic fallacy, and we present here a theoretical treatment and illustration showing that the observed behavior is predicted by a simple mechanism in which desensitization can occur only from a shut state.
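As a purely illustrative toy model with arbitrary rate constants (our own construction, not the kinetic scheme treated in the paper), the following Monte Carlo sketch lets desensitization proceed only from the bound shut state, yet trials in which an opening is observed still end up desensitized more often, because an observed opening is also evidence that agonist bound at all:

```python
# Toy Monte Carlo illustration (arbitrary rates, not the paper's scheme):
# R --bind--> AR <--> O (open), with desensitization AR --> D allowed ONLY
# from the bound shut state AR. Trials in which an opening was observed still
# show more desensitization, because an opening is merely evidence that the
# receptor was bound, which is a prerequisite for desensitizing.

import random

RATES = {                      # all in s^-1, chosen arbitrarily
    ("R", "AR"): 5.0,          # agonist binding
    ("AR", "R"): 50.0,         # unbinding
    ("AR", "O"): 500.0,        # channel opening
    ("O", "AR"): 500.0,        # channel shutting
    ("AR", "D"): 100.0,        # desensitization (only from the shut state AR)
}

def run_trial(duration: float = 0.1) -> tuple[bool, bool]:
    """Simulate one agonist pre-exposure; return (opened, desensitized)."""
    t, state, opened = 0.0, "R", False
    while t < duration and state != "D":
        transitions = [(dst, k) for (src, dst), k in RATES.items() if src == state]
        total = sum(k for _, k in transitions)
        t += random.expovariate(total)
        if t >= duration:
            break
        r, acc = random.uniform(0.0, total), 0.0
        for dst, k in transitions:
            acc += k
            if r <= acc:
                state = dst
                break
        opened = opened or state == "O"
    return opened, state == "D"

def main(n_trials: int = 20000) -> None:
    counts = {True: [0, 0], False: [0, 0]}   # opened? -> [trials, desensitized]
    for _ in range(n_trials):
        opened, desensitized = run_trial()
        counts[opened][0] += 1
        counts[opened][1] += int(desensitized)
    for opened, (n, d) in counts.items():
        label = "openings seen" if opened else "no openings  "
        print(f"{label}: {d / max(n, 1):.2f} desensitized ({n} trials)")

if __name__ == "__main__":
    main()
```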